
Google, Zalando, and Snapchat All Want to Replace Your Fitting Room. None of Them Have.

Between 20 and 40 percent of clothing bought online gets returned. Half of those returns are fit-related. Not wrong color. Not changed mind. The medium was not a medium.

Half of online shoppers now buy multiple sizes expecting to return the rest. The industry calls this “bracketing” — as if naming a failure makes it a feature.

Three of the largest tech and retail companies have each spent years building their version of virtual try-on. They built three fundamentally different things.

To test them, we ran the same person through every tool we could access — same pose, same base outfit, same lighting. This is her.

Our test subject: an AI-generated reference photo (Gemini Pro) of a woman in a black tank top and dark jeans, designed to match the neutral pose and plain background that virtual try-on tools expect. Same image, every tool.

The landscape at a glance

Twenty-two providers. Twelve dimensions. One pattern: look at the Fit column.

Scores run from 1 (weakest) to 10 (strongest); n/a means the dimension does not apply to that provider.

Columns: Ease = ease of use | Fit = fit prediction | Vis = visual realism | You = preserves your appearance | Det = garment detail | Spd = speed | Cat = category coverage | Bod = body diversity | Pos = pose flexibility | Acc = platform access | Scl = catalog scale | Ret = proven return reduction

Big tech

                   Ease  Fit  Vis  You  Det  Spd  Cat  Bod  Pos  Acc  Scl  Ret
Google Shopping       8    1    8    7    6    5    6    9    5    8   10    2
Snapchat AR           7    1    6    5    4    8    7    5    5    5    5    5
Amazon                6    3    6    4    6    9    5    3    5    5    7    2
Pinterest             7    1    4    4    4    9    3    3    3    5    5    2
Apple Vision Pro      1    1    5    2    5    5    1    1    3    1    1    1

Retailers and startups

                   Ease  Fit  Vis  You  Det  Spd  Cat  Bod  Pos  Acc  Scl  Ret
Zalando               4    8    4    3    4    5    2    5    3    5    2    7
AIUTA (ASOS)          6    2    6    7    5    6    5    7    5    4    5    2
Zelig                 6    3    7    5    6    5    6    6    5    5    5    5
Fytted                5    7    6    6    5    5    5    6    4    5    6    6
Doji                  5    2    6    6    5    5    4    5    4    4    4    1
Genlook               7    1    5    5    4    5    3    4    3    5    3    3
Clad†                 7    1    6    7    5    6    8    5    5    5    8    1

† Clad publishes this blog. Scores self-assessed using the same rubric applied to all other providers.

Luxury and regional

                   Ease  Fit  Vis  You  Det  Spd  Cat  Bod  Pos  Acc  Scl  Ret
Gucci                 5    1    7    5    6    8    3    3    5    4    2    1
Cartier               5    1    7    4    6    7    2    2    4    4    1    1
Burberry              7    1    7    7    5    7    2    6    3    6    2    1
Lenskart              7    6    6    6    5    7    2    6    4    6    7    7
Ably                  6    4    6    6    5    5    5    5    4    5    6    5
ZOZO                  3    8    3    3    3    4    6    7    3    5    7    5

Size and measurement specialists

                   Ease  Fit  Vis  You  Det  Spd  Cat  Bod  Pos  Acc  Scl  Ret
True Fit              8    8  n/a  n/a  n/a    9    7    7  n/a    7    9    9
Fit Analytics         8    8  n/a  n/a  n/a    9    7    7  n/a    7    8    7
Virtusize             6    7  n/a  n/a  n/a    7    5    5  n/a    5    4    5
3DLOOK                5    7    3    4    3    5    4    7    3    4    4    6

The tools with the highest fit scores show you nothing. The tools with the best visuals tell you nothing about fit. Nobody does both.

Google: the prettiest picture in the room

Google’s virtual try-on launched June 2023 with women’s tops from a handful of brands. By late 2025, you could upload a selfie and see an AI-generated full-body image of yourself in nearly anything across 50 billion product listings.

TODO: screenshot of Google Shopping virtual try-on results
TODO: screenshot — search for a top on Google Shopping and use the try-on feature.

The technology uses diffusion-based AI — same family as image generators — trained on clothing-body image pairs. It understands, to a meaningful degree, how fabrics stretch, fold, and catch light. At launch, it showed garments across ~80 real models spanning XXS to 4XL, multiple skin tones and body shapes. In December 2025, it added full-body generation from just a selfie.

TODO: screenshot of Google's selfie-based try-on
TODO: screenshot — upload a selfie and show the generated result vs the original photo.
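For a sense of what "diffusion-based, trained on clothing-body image pairs" means mechanically, here is a toy sketch of an image-conditioned denoising loop. Everything in it is hypothetical: the schedule, the resolution, and especially `toy_denoiser`, which stands in for the large trained network a system like Google's actually runs.

```python
import numpy as np

# Toy image-conditioned DDIM-style sampler. All names here are ours, not Google's;
# `toy_denoiser` is a stub so the control flow is runnable end to end.

H = W = 64                                    # toy resolution

def toy_denoiser(x_t, t, person, garment):
    """Placeholder for the trained noise-prediction network.

    A real try-on model fuses person and garment features inside the network;
    this stub just blends them so the sampling loop has something to call.
    """
    cond = 0.5 * person + 0.5 * garment
    return x_t - cond

def sample_tryon(person, garment, steps=50, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((H, W, 3))        # start from pure noise
    alphabar = np.linspace(0.999, 0.01, steps)   # toy noise schedule

    for i in range(steps - 1):
        a_t, a_prev = alphabar[i], alphabar[i + 1]
        eps = toy_denoiser(x, i, person, garment)              # predicted noise
        x0 = (x - np.sqrt(1 - a_t) * eps) / np.sqrt(a_t)       # predicted clean image
        x = np.sqrt(a_prev) * x0 + np.sqrt(1 - a_prev) * eps   # deterministic DDIM update
    return np.clip(x0, 0.0, 1.0)

person_img = np.random.rand(H, W, 3)          # would be the uploaded selfie
garment_img = np.random.rand(H, W, 3)         # would be the product photo
result = sample_tryon(person_img, garment_img)
print(result.shape)                           # (64, 64, 3)
```

The point of the sketch is the shape of the system, not its quality: the sampler only ever sees pixels and conditioning images. Nowhere in the loop is there a measurement.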

What it does well

Scale and access. The try-on lives inside Google Search — no app, no account, no friction. Listings with try-on get ~60% more engagement.1 The diversity of model representation is genuine and ahead of competitors.

What it cannot do

Tell you whether the garment fits. This is not a minor caveat. Google’s system has no concept of your measurements. It cannot predict shoulder seam placement, waistband fit, or sleeve tightness. It is a visualization tool, not a fit tool.

Google, to its credit, says “see how it looks” — not “see how it fits.” The gap between those phrases contains the entire problem.

Hands-on testing reveals uncanny valley moments — face artifacts, AI-generated legs that look slightly off, complex patterns turning to visual noise.2 And strategically, this is Google’s response to losing fashion commerce to TikTok, Amazon, and Instagram — not primarily a gift to shoppers.3

Snapchat: fun but not fit

Snapchat overlays digital garments onto your live camera feed in real time. Same tech as face filters, extended to the body.

TODO: screenshot of Snapchat AR try-on lens
TODO: screenshot — open a brand's Snapchat try-on lens for shoes or eyewear.
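To make the overlay mechanics concrete, here is a hedged sketch of the simplest possible version: scale a transparent sunglasses image off two eye landmarks and alpha-composite it onto the frame. The landmark coordinates are assumed to come from a face tracker, the image arrays are placeholders, and Snap's production pipeline handles rotation, depth, and occlusion that this does not.

```python
import numpy as np
import cv2  # OpenCV; pip install opencv-python

def overlay_glasses(frame, glasses_rgba, left_eye, right_eye):
    """Composite a transparent RGBA product image onto a BGR frame, anchored on eye landmarks.

    left_eye / right_eye are (x, y) pixel coordinates that a per-frame face
    tracker would supply; here they are assumed inputs.
    """
    eye_dist = np.hypot(right_eye[0] - left_eye[0], right_eye[1] - left_eye[1])
    scale = (eye_dist * 2.2) / glasses_rgba.shape[1]        # frames roughly 2.2x the eye distance
    gw, gh = int(glasses_rgba.shape[1] * scale), int(glasses_rgba.shape[0] * scale)
    glasses = cv2.resize(glasses_rgba, (gw, gh))

    cx = int((left_eye[0] + right_eye[0]) / 2)              # anchor at the midpoint of the eyes
    cy = int((left_eye[1] + right_eye[1]) / 2)
    x0, y0 = cx - gw // 2, cy - gh // 2

    # Alpha-blend the overlay into the frame, clipped to the frame bounds.
    x1, y1 = max(x0, 0), max(y0, 0)
    x2, y2 = min(x0 + gw, frame.shape[1]), min(y0 + gh, frame.shape[0])
    g = glasses[y1 - y0:y2 - y0, x1 - x0:x2 - x0]
    alpha = g[:, :, 3:4] / 255.0
    frame[y1:y2, x1:x2] = (alpha * g[:, :, :3] + (1 - alpha) * frame[y1:y2, x1:x2]).astype(np.uint8)
    return frame

# Placeholder usage: real landmarks come from a face tracker each frame, and the
# PNG would be cv2.imread("glasses.png", cv2.IMREAD_UNCHANGED).
frame = np.zeros((480, 640, 3), dtype=np.uint8)
glasses_png = np.zeros((120, 300, 4), dtype=np.uint8)
preview = overlay_glasses(frame, glasses_png, left_eye=(250, 200), right_eye=(390, 200))
```

Notice what the overlay never touches: the body behind the pixels. That is exactly why this approach works for glasses and struggles with clothing.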

The numbers are large: 250+ million Snapchatters have engaged with AR shopping lenses over five billion times. Ulta reported $6M in incremental sales from a single two-week campaign.4 Gucci, Prada, Dior, and Fendi have all built on the platform.

Where it works

Accessories and beauty. When the product sits on a flat, trackable surface — your face for glasses, your feet for shoes — the overlay is convincing. Eyewear try-on adoption among major retailers has reached 70–80%.5 The social loop (try on, snap to a friend, get opinion) is a real differentiator no other platform has.

Where it doesn’t

Clothing. Fabric drapes, stretches, bunches — an AR overlay cannot simulate this. The camera doesn’t know your body shape. A size 14 and a size 4 see the same overlay.6

Snap tried selling this tech to retailers as a standalone service (ARES). Launched March 2023, shut down six months later — too complex, not engaging enough for real retail contexts.7

Zalando: closest to useful, furthest from scale

Zalando builds a 3D avatar from your body measurements and simulates how a garment sits on it, using a color-coded heatmap showing where it fits tight and where it’s loose.

TODO: screenshot of Zalando's virtual fitting room heatmap
TODO: screenshot — use Zalando's virtual fitting room on a pair of jeans and show the fit heatmap.

This is the only major approach that attempts to communicate actual fit — not just appearance.
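A stripped-down sketch of the underlying idea, with our own invented thresholds: compute the "ease" (garment circumference minus body circumference) at a few landmarks and map it to a tight-to-loose label. Zalando's real system drapes a 3D garment on a 3D avatar; this is only the flat arithmetic behind the colors.

```python
# Illustrative only: a flat, landmark-level approximation of a fit heatmap.
# Real systems drape a 3D garment mesh over a 3D body avatar.

BODY_CM = {"chest": 96.0, "waist": 84.0, "hip": 102.0}      # from the shopper
GARMENT_CM = {"chest": 100.0, "waist": 82.0, "hip": 108.0}   # from the size chart / tech pack

def fit_heatmap(body, garment):
    """Map ease (garment minus body, in cm) at each landmark to a fit label."""
    heatmap = {}
    for landmark, body_cm in body.items():
        ease = garment[landmark] - body_cm
        if ease < 0:
            label = "tight"      # garment smaller than body: fabric under tension
        elif ease < 4:
            label = "fitted"
        elif ease < 10:
            label = "regular"
        else:
            label = "loose"
        heatmap[landmark] = (round(ease, 1), label)
    return heatmap

print(fit_heatmap(BODY_CM, GARMENT_CM))
# {'chest': (4.0, 'regular'), 'waist': (-2.0, 'tight'), 'hip': (6.0, 'regular')}
```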

What it does differently

The team deliberately chose not to show customers’ faces on the avatar — avoiding the self-perception problems of looking at a digital version of yourself.8 The visualization prioritizes whether the garment fits, not how you look in it.

Zalando also runs a size recommendation system trained on millions of purchase-and-return data points. Size flags (“runs small — size up”) reduced size-related returns ~10%.9 Pilot results: up to 40% fewer returns for items where the fitting room was used.10
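The size-flag logic is conceptually even simpler, though Zalando's actual system is a trained model rather than a rule. As a toy illustration with invented thresholds: if returns for an item skew heavily toward "too small," flag it.

```python
# Toy "runs small / runs large" flag from return-reason counts.
# The 0.08 rate and 2x dominance thresholds are invented for illustration.

def size_flag(orders, returned_too_small, returned_too_large, min_orders=200):
    """Flag an item when one size-related return reason clearly dominates."""
    if orders < min_orders:
        return None                      # not enough signal yet
    small_rate = returned_too_small / orders
    large_rate = returned_too_large / orders
    if small_rate > 0.08 and small_rate > 2 * large_rate:
        return "Runs small: consider sizing up"
    if large_rate > 0.08 and large_rate > 2 * small_rate:
        return "Runs large: consider sizing down"
    return None

print(size_flag(orders=1200, returned_too_small=150, returned_too_large=30))
# Runs small: consider sizing up
```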

The caveats

That pilot had ~30,000 participants. Zalando has 50 million customers. By late 2024, cumulative usage was ~80,000 users. The fitting room covers a curated subset — some jeans, some tops. Not dresses, outerwear, or footwear.

Rollout to all customers planned for 2026 — three years after the pilot. If 40% return reduction held universally, you’d expect faster deployment. Meanwhile, Zalando shortened its return window from 100 to 30 days. The carrot and the stick, simultaneously.

Everyone else

TODO: collage of startup virtual try-on interfaces
TODO: screenshot collage — try Zelig on REVOLVE, AIUTA on ASOS, Doji app, Genlook Shopify plugin.

Amazon has AR shoe and eyewear try-on but killed its physical Prime Try Before You Buy program in 2025, saying its AI-powered sizing and try-on tools made it redundant, then shipped no clothing alternative. Pinterest does lipstick and eyeshadow. Apple has done almost nothing despite owning LiDAR hardware.

Startups: Zelig (on REVOLVE, claims 3x conversion). Fytted (50 body measurements from phone camera). AIUTA (on ASOS, upload photo or pick from 20 AI models). Doji ($14M raised, persistent avatar app). Clad (disclosure: ours — any product from any store via URL or browser extension; image-to-image via Vertex AI; early stage). On the B2B side, Clad works directly with brands — send garment design files, get physics-based fit simulation for your catalog. Brands can reach out at hello@clad.you.

Luxury brands treat try-on as brand experience — Gucci’s markerless AR watch, Cartier’s virtual Pont Alexandre III bridge. None helps you figure out whether a jacket fits.

Regional standouts: Lenskart (India) reports 35–40% conversion among try-on users. Ably (Korea) saw 5x sales jump in two months. ZOZO (Japan) mails you a physical measurement bodysuit.

Open source is catching up. CatVTON runs on a consumer GPU with <8GB memory. Genlook offers try-on as a Shopify plugin with a free tier. What required an enterprise Snap contract two years ago now costs nothing.

What none of them solve

No current system can tell you:

  • Whether the fabric digs into your waist
  • Whether the shoulder seams sit correctly on your frame
  • Whether the sleeves are too tight when you raise your arms
  • Whether the waistband rolls or stays put
  • Whether the fabric is sheer in direct light
  • How the garment moves when you walk
  • How it feels after eight hours

When you grade every provider on fit prediction — not visual appearance, but dimensional fit — nearly all land at the bottom of the scale. Google: 1. Snapchat: 1. Amazon: 1. Every diffusion-based startup: 1. The only systems that score meaningfully on fit are the ones that show you no picture at all — size recommendation engines like True Fit.11

An estimated 15% of online returns are driven by fabric texture disappointment — a variable no image can address.12

Virtual try-on is weakest in the exact category where returns cost the most. Eyewear adoption is high because eyewear has no fit variable. Beauty works because color matching is solvable. Apparel — where the problem is most acute — is where every approach falls shortest.

The number that tells the real story

Only 1.4% of adults aged 18–65 regularly use any form of virtual try-on.13

After years of investment. Billions in development. Countless press releases about the future of shopping.

The industry points to satisfaction scores among existing users. But satisfaction among the self-selected, tech-comfortable few doesn’t address the adoption failure. A UX study across four brands found seven significant usability issues — camera failures with no explanation, inconsistent availability, no comparison tools. Only 28% said it helped.14

The body image problem

Iowa State researchers analyzed data from 8,000+ customers. Virtual fitting rooms increased sales among low-BMI shoppers. For high-BMI shoppers, sales dropped and self-esteem declined.15

The effect was stronger than in physical fitting rooms. Over 60% of American women wear plus-size clothing. The technology actively works against the largest consumer segment.

Meanwhile, AI training data for these systems overwhelmingly features youth (100%), thinness (87.5%), and revealing clothing (87.5%).16 The systems don’t create beauty standards — they reproduce and scale them.

Upload a photo to virtual try-on. The system captures your facial geometry, skin tone, facial landmarks, sometimes body measurements. Under Illinois BIPA, that’s biometric information requiring written consent and a retention policy.

Since 2021: 15+ class action lawsuits. Charlotte Tilbury settled for ~$3M. Amazon’s class certification affirmed late 2024. Louis Vuitton’s motion to dismiss rejected.17

Body data feels different from browsing history. One user in a Retail TouchPoints investigation abandoned the feature because setup felt uncomfortably close to a nude selfie.18

What would actually help

The fitting room provides fit information grounded in your specific body meeting a specific garment. Not a picture of a garment on a picture of you. Actual data: this garment, in this size, sits this way on your body.

That requires two things with precision: your body dimensions and the garment’s dimensions. Size recommendation engines — True Fit, Fit Analytics, Virtusize — have shown 14–50% return reductions without any image.19 Just accurate measurements on both sides.
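In code, the core of a measurement-based recommendation is unglamorous: compare the shopper's measurements to the brand's size chart and pick the size whose ease comes closest to a target. A hedged sketch with invented numbers; real engines like True Fit learn the weights and targets from purchase and return history rather than hard-coding them.

```python
# Hypothetical size chart for one garment (cm). Real engines hold per-brand,
# per-style charts and learn measurement weights from purchase/return data.
SIZE_CHART = {
    "S":  {"chest": 92,  "waist": 76, "hip": 98},
    "M":  {"chest": 98,  "waist": 82, "hip": 104},
    "L":  {"chest": 104, "waist": 88, "hip": 110},
    "XL": {"chest": 110, "waist": 94, "hip": 116},
}
WEIGHTS = {"chest": 1.0, "waist": 0.8, "hip": 0.6}   # invented: chest matters most for a top

def recommend_size(body_cm, size_chart=SIZE_CHART, weights=WEIGHTS, target_ease=4.0):
    """Pick the size whose ease over the body is closest to the target, per weighted measurement."""
    best_size, best_score = None, float("inf")
    for size, garment in size_chart.items():
        score = sum(
            weights[m] * abs((garment[m] - body_cm[m]) - target_ease)
            for m in body_cm
        )
        if score < best_score:
            best_size, best_score = size, score
    return best_size

print(recommend_size({"chest": 101, "waist": 86, "hip": 106}))  # -> "L"
```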

The technology that solves the fitting room problem is probably not a better image. It’s a better measurement — combined with a simulation honest enough to tell you where fabric touches skin and where it doesn’t.

TODO: side-by-side comparison of the three approaches
TODO: screenshot — same garment through Google (image), Snapchat (AR), and Zalando (heatmap).

The pieces exist in fragments. CLO3D and Browzwear simulate garment physics for designers. 3DLOOK extracts body measurements from two photos. Neural networks simulate fabric draping up to 100x faster than traditional physics engines.20 Open-source models like MC-VTON need just 8 inference steps.21
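For a flavor of what "simulate garment physics" means at the smallest scale, here is a toy cloth step in the general spirit of those engines, not their actual method: a grid of particles integrated with Verlet steps plus distance-constraint relaxation, the core loop of position-based dynamics.

```python
import numpy as np

# Toy cloth: an 8x8 particle grid pinned along one edge, integrated with Verlet
# steps plus distance-constraint relaxation. Illustrative only; production engines
# add bending stiffness, collisions with the body mesh, friction, and measured
# fabric parameters.

N, REST, DT = 8, 0.05, 1.0 / 60.0
GRAVITY = np.array([0.0, -9.8, 0.0])

xs, ys = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
pos = np.stack([xs * REST, np.zeros((N, N)), ys * REST], axis=-1)   # cloth held flat
prev = pos.copy()
pinned = np.zeros((N, N), dtype=bool)
pinned[:, 0] = True                              # one edge fixed, like fabric on a rail
pin_pos = pos[pinned].copy()

def simulate_step():
    global pos, prev
    # 1) Verlet integration: inertia plus gravity on every free particle.
    new = pos + (pos - prev) + GRAVITY * DT * DT
    new[pinned] = pos[pinned]
    prev, pos = pos, new
    # 2) Relax the horizontal and vertical "springs" back toward their rest length.
    for _ in range(10):
        for a, b in ((pos[:-1, :], pos[1:, :]), (pos[:, :-1], pos[:, 1:])):
            delta = b - a
            dist = np.linalg.norm(delta, axis=-1, keepdims=True) + 1e-9
            corr = 0.5 * (1.0 - REST / dist) * delta
            a += corr
            b -= corr
        pos[pinned] = pin_pos

for _ in range(120):                              # two simulated seconds
    simulate_step()
print(pos[N // 2, N - 1])                         # free-edge particle has draped downward
```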

A few teams are attempting some version of this assembly. We are one of them — Clad started as an image-to-image try-on and is building toward physics-based fit simulation: body mesh from measurements, garment draped with pressure mapping. Think Zalando’s approach — heatmaps, dimensional fit, honest about where it’s tight — but available to any brand on any platform, not just Zalando’s own catalog. And wrapped in something a shopper might actually enjoy using. A working prototype generates 3D body models from a questionnaire and a photo. The gap between that prototype and something a shopper would trust is considerable, and it is the same gap every team in this space faces. Our particular focus is non-standard sizing — plus-size, petite, adaptive — where fit uncertainty is highest and where existing visual tools fail hardest.

Assembling these pieces into something usable on a phone — without uploading intimate data, without an avatar that harms self-image, without 10 minutes of setup — is the actual engineering problem. Harder than generating a pretty picture. Which is probably why fewer companies are working on it.

The fitting room is still the fitting room

The online-to-in-store apparel return gap — 22% vs 7% — has persisted through years of investment.22 The technology improved. The gap didn’t close.

A physical fitting room answers questions you didn’t know to ask. The way a collar sits. The weight of fabric on your shoulder. Whether you can sit down in these pants. Instantaneous, embodied, no technology required beyond a mirror and adequate lighting.

The pivot from “returns reducer” to “styling inspiration tool” — happening across the industry — is the most honest signal.23 When a startup reframes try-on as outfit exploration rather than return prevention, it’s acknowledging the original pitch was harder to prove than expected.

Virtual try-on is not pointless. A better product image is still a better product image. AR sunglasses try-on is genuinely helpful. Size recommendation systems quietly prevent millions of wrong-size orders.

But the claim that any tool replaces the fitting room — that you can know, from a screen, whether a garment fits your body — remains unfulfilled.

The fitting room is still the fitting room. Everything else is a picture.

Frequently asked questions

Does virtual try-on actually work for clothing?

Depends on what “work” means. For seeing how a garment looks on a body shaped like yours, Google’s approach produces useful images. For eyewear or makeup, Snapchat and similar AR tools are mature — eyewear adoption has reached 70–80% among major retailers.

For predicting whether a garment will fit — the question that drives most returns — almost no tool works. Only measurement-based systems like Zalando’s and size recommendation engines like True Fit attempt an answer, with limited coverage and adoption.

Which virtual try-on is the most accurate?

Visual realism: Google. Fit prediction: Zalando and size specialists (True Fit, Fit Analytics, Virtusize). No single tool excels at both.

Why are online clothing return rates still so high?

20–40% of online clothing gets returned, ~half for fit. Virtual try-on hasn’t closed this gap because most approaches prioritize visual appearance over fit prediction. The 22% vs 7% return rate gap has persisted. Only 1.4% of adults regularly use any try-on tool.

Is virtual try-on safe? What happens to my body data?

Tools capture facial geometry, skin tone, and sometimes body measurements — biometric data under laws like Illinois BIPA. 15+ lawsuits since 2021, settlements reaching ~$3M. Check privacy policies before uploading.

What is the difference between virtual try-on and size recommendation?

Virtual try-on generates a picture. Size recommendation predicts a size from measurements — no image. Size recommendation has demonstrated 14–50% return reductions. Most visual try-on hasn’t matched that. The ideal combines both. No tool at scale does this yet.

Can I use virtual try-on with any clothing store?

Most tools are locked to one retailer. Zalando’s works on Zalando. Google’s works on Google Shopping. A few tools — including Clad — are store-agnostic: paste a product URL or send a garment image from any online store. The trade-off is that retailer-integrated tools access richer product data (size charts, garment dimensions), while store-agnostic tools rely on whatever the user provides.

Does virtual try-on work for plus-size, petite, or non-standard sizing?

Poorly, in most cases. AI training data skews toward thin, young bodies. Fit prediction is least reliable where size variation is greatest — which is exactly where it matters most. Zalando’s avatar supports a range of body shapes but covers a limited catalog. Physics-based simulation — a 3D garment draped on a body model built from your measurements — is the most promising approach for non-standard sizing, but no tool delivers this at consumer scale yet.

References


1. eMarketer, September 2024: virtual try-on listings receive approximately 60% more high-quality views than standard product listings.
2. How-To Geek review of Google’s virtual try-on, noting pattern rendering issues and uncanny valley artifacts with AI-generated body imagery.
3. Modern Retail, 2023: commerce consultant framing Google’s try-on launch as a response to TikTok’s visual commerce advantage.
4. Snap for Business reporting: Ulta Beauty Snapchat AR campaign results, 2023.
5. Industry data on eyewear virtual try-on adoption rates among major online retailers, 2024.
6. Shanghai Garment industry analysis on limitations of visual virtual try-on for fit assessment.
7. TechCrunch, September 2023: Snap shutters ARES enterprise services division.
8. Zalando Design team documentation on avatar design philosophy, published on Medium.
9. Zalando Research: SFNET personalized size recommendation system, developed collaboratively with Shopify and OLX.
10. Zalando Corporate, April 2023: virtual fitting room pilot results across 25 European markets.
11. Feature comparison grading across 22 major virtual try-on players, February 2026. Fit accuracy scored 1-5 based on dimensional prediction capability.
12. Industry estimates on returns driven by fabric texture disappointment, cited in multiple fashion technology analyses.
13. eMarketer survey, October 2024: 1.4% of adults aged 18-65 regularly use any form of virtual try-on.
14. Netguru UX research study: 21-day test across 18 users, 4 brands, and 4 categories.
15. Iowa State University, April 2023: six studies analyzing data from 8,000+ customers on virtual fitting rooms and body image effects.
16. PsyPost analysis of AI-generated female images in training data: youth (100%), thinness (87.5%), revealing clothing (87.5%).
17. ClassAction.org: Charlotte Tilbury BIPA settlement ($2.925M, February 2025). Duane Morris: Amazon BIPA class certification affirmed December 2024.
18. Retail TouchPoints, September 2024: user experience documentation of virtual try-on adoption barriers.
19. ASOS and Foot Locker reported 14% return reduction with Fit Analytics; True Fit reports up to 50% reduction for single-brand DTC retailers; YNAP reported 25% return reduction with 28% conversion increase.
20. Neural Cloth Simulation (ACM TOG, 2022); GAPS (3DV, 2024). Neural methods achieve 10-100x speedup over traditional physics simulation.
21. MC-VTON (arXiv, January 2025): 39.7M additional parameters (0.33% of backbone), 8 inference steps. CatVTON (ICLR 2025): runs at under 8GB VRAM at 1024x768.
22. ICSC, March 2024: online apparel return rate of 22% versus 7% in-store.
23. Business of Fashion, 2024: coverage of industry pivot from “returns reducer” to “styling tool” positioning for virtual try-on products.
