Google Shopping now uses AI to model clothes on different body types
Google is introducing two new features to its online shopping experience designed to help users search for clothes in more detail and better visualize how clothes will look on different body types.
Starting today, Google Shopping users in the US can access a virtual try-on experience that realistically depicts what a piece of clothing will look like on a selection of real human models. These models span different skin tones, ethnicities, hair types and body shapes, ranging in size from XXS to 4XL, to help users see how a piece of clothing will look on a body type similar to their own.
Initially, only women’s tops from a selection of brands such as H&M, Anthropologie, Everlane and Loft will be available for the virtual try-on experience, with Google claiming men’s tops and “other apparel” will be available sometime later this year. The feature is designed to help customers avoid disappointment by accurately visualizing what a piece of clothing will look like before purchasing it. The company, citing its own shopping data, claims that 59 percent of online shoppers are disappointed with a clothing purchase because they expected it to look different on their body, and 42 percent feel unrepresented by online clothing models.
Google Shopping’s new virtual try-on experience uses a diffusion-based generative AI model, which is trained by adding Gaussian noise (essentially random pixels) to an image that the model then learns to remove in order to produce realistic images. The process enables Google’s AI model to realistically represent how a piece of clothing would crease, drape, fold, cling and stretch on the range of available models, regardless of their angle or pose. To be clear, the models for Google Shopping are not AI-generated – AI is simply used to shape the clothes around images of these human models.
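The "add noise, then learn to remove it" training setup can be illustrated with a toy sketch. This is not Google's model – just a minimal, hypothetical illustration of the forward noising step of a standard diffusion process, using a linear noise schedule and a flat list of pixel values in place of a real image:

```python
import math
import random

def noise_image(x0, t, num_steps=1000, beta_start=1e-4, beta_end=0.02):
    """Blend a clean 'image' x0 with Gaussian noise at timestep t.

    Follows the standard forward process: x_t is a weighted mix of the
    original signal and fresh Gaussian noise. As t grows, the signal
    weight shrinks and the result approaches pure noise; a generative
    model is trained to reverse this, removing the noise step by step.
    """
    # Linear schedule of per-step noise amounts (a common, simple choice).
    betas = [beta_start + (beta_end - beta_start) * i / (num_steps - 1)
             for i in range(num_steps)]
    # Cumulative fraction of the original signal that survives up to step t.
    alpha_bar = 1.0
    for beta in betas[: t + 1]:
        alpha_bar *= 1.0 - beta
    signal, noise_scale = math.sqrt(alpha_bar), math.sqrt(1.0 - alpha_bar)
    return [signal * p + noise_scale * random.gauss(0.0, 1.0) for p in x0]

random.seed(0)
image = [0.5] * 8                 # a tiny flat "image" of 8 pixels
early = noise_image(image, t=10)  # early step: mostly original signal
late = noise_image(image, t=999)  # final step: almost pure Gaussian noise
```

A real system would operate on full image tensors and train a neural network to predict the added noise; the point here is only the noising schedule the training objective is built around.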
New filters are also launching in Google Shopping today that are designed to help users find exactly what they’re looking for, such as a similar but cheaper alternative to a shirt or a jacket with a different pattern. Machine learning and visual matching algorithms allow users to refine results by attributes such as color, style and pattern across various online clothing stores to find an item that best suits their requirements. The feature is now available in product listings on Google Shopping and is likewise limited to tops – Google hasn’t said when it will expand to other types of clothing.
Levi’s similarly announced in March that it was using AI to expand its online shopping modeling options. However, instead of using images of real people like Google, the denim brand said it would test using AI-generated models, initially claiming this would help “diversify” its shopping experience. Levi’s later walked back those comments after the announcement was met with backlash, but maintained that AI-generated models would allow the brand to “more quickly publish more images of our products on a range of body types”.