A mobile application powered by AI that allows users to "try on" clothing, shoes, and accessories online using their smartphone camera. It is designed to give shoppers confidence, save time, and reduce unnecessary returns.
Type
Startup
Position
Product Designer
Duration
05/2025 - 08/2025
Tools
Figma, Notion
My role & Responsibilities
Product Designer - full design cycle
Research & Discovery
To uncover real pain points:
Conducted user research
Competitor analysis
Structure & Strategy
To define product structure:
Built information architecture
User flows
Wireframes
Design & Prototyping
To bring ideas to life:
Designed UI
Mockups
Interactive prototypes
Communication
To communicate the product’s value:
Created presentation website
Collaboration
I collaborated closely with our lead product designer and the CEO, aligning every design decision with both user needs and the startup’s business vision.
Outcome
The result was a clear product concept and prototype that the team believes represents the future of online shopping - a smarter, more confident way to choose clothes online.
Problem
Shoppers face significant uncertainty when buying clothes online. They struggle to imagine how an item will look on their own body, have little guidance on which cuts or styles suit them personally, and feel unsure about choosing colors that match their preferences and appearance. This lack of confidence often leads to hesitation, abandoned carts, or unsatisfying purchases.
Goal
The goal was to help shoppers make confident purchase decisions by allowing them to visualize how clothing fits their body, discover styles and cuts that match their individual shape, and explore color combinations that feel harmonious. By reducing uncertainty, the product aims to increase user satisfaction and lower cart abandonment rates.
01 step
I reviewed five fashion and AR try-on apps to evaluate:
Usability
Feature sets
Pricing models
Key findings
Most apps cover only limited product categories (mainly sneakers or accessories)
Apps tend to emphasize either AR try-on or photo-based styling, but rarely combine both
FitRoom and StyleDNA stood out with broader product integration
Other apps focused on simplicity and free access
Insights for design
Expand product variety beyond single categories
Improve onboarding clarity
Balance free vs. paid features
02 step
Making hypotheses
To ground the design process in user needs, I began by creating a persona, Kate, and mapping out her online shopping journey to highlight key frustrations and unmet needs. This helped generate initial hypotheses for possible solutions.
Looking for evidence
User research:
11 user interviews across different shopping habits and demographics
Gathered insights into real behaviours and expectations
Outcomes:
Refined early assumptions
Identified what users truly want in an online shopping experience
Ensured design decisions were based on evidence, not assumptions
Uncertainty about sizing
Hard to visualise fit
Lack of clarity on personal style
Colour confidence issues
03 step
Ideation
Explored a broad set of possibilities:
Size storage in user profiles
AR try-on with styling overlays
Multi-size virtual try-on
Interactive AI assistant
Goal:
Think divergently
Cover the widest range of solutions
Validation
Collaborated with developers and the CEO
Technical feedback:
Full-body AR and storage-heavy features had performance limits
Business feedback:
Prioritised quick adoption and broad device compatibility
Insights helped refine ideas into feasible, user-friendly solutions
Final features
AR Try-on
(Real-time outfit visualization with capture, save, and share options, plus instant AI feedback)
Photo Try-on
(Upload a photo to generate outfits, accessible without AR)
AI assistant
(Personalized advice on body type, color coordination, style, and motivational support)
04 step
Information architecture
With the three core features defined, I moved into structuring the product around three core flows:
Browsing products
Using AR / Photo Try-on
Interacting with the AI assistant
Wireframing
Based on the IA, I developed mid-fidelity wireframes in Figma to visualize screen layouts and interactions without the distraction of final visuals.
AR try-on flow
Photo try-on flow
AI assistant flow
Working at mid-fidelity allowed me to quickly share concepts with stakeholders, gather feedback, and iterate before moving into visual design.
05 step
Feedback
Focused on technical feasibility and business alignment
Refinements
Improved flows for AR try-on, photo upload, and AI assistant → more intuitive for users and realistic to implement
Outcome
Confidence to move forward into high-fidelity design
06 step
AR try-on flow
Problem
Online shoppers struggle to imagine how clothes will actually look on their bodies. Most try-on apps only cover sneakers or accessories, leaving outfit visualisation limited and often unrealistic.
Solution
A real-time AR try-on that lets users instantly see outfits on themselves. They can capture, save, or share photos and even request instant AI feedback - making the experience both interactive and confidence-boosting.
Photo try-on flow
Problem
Not all users have devices that support AR, or they may want to try outfits quickly without using the camera. This creates a barrier to adoption.
Solution
A photo-upload try-on, where users can upload a picture of themselves and see how outfits would look on them. This makes the experience accessible to all users, across any device, without sacrificing personalization.
AI assistant flow
Problem
Shoppers often feel uncertain about size, fit, and style choices. They may know what they like but lack guidance on body type, color coordination, or styling confidence.
Solution
An AI-powered assistant that offers personalized advice on size, fit, color matching, and outfit style. Beyond functional tips, it also provides motivational support, reducing decision fatigue and making shopping feel more enjoyable.
07 step
Goals
Communicate the product’s value and core features in a clear, engaging way
Create trust and excitement around the product vision
Provide a simple entry point for early adopters and testers
Design focus
Concise storytelling of the problem and solution
Visual consistency with the app’s branding and UI
Easy navigation to highlight AR Try-on, Photo Try-on, and AI Assistant features
Outcome
The website served as both a marketing tool and a communication asset, helping align stakeholders and attract early interest in the product.
08 step
At the end of the development and testing process, all new interface components were integrated into the design system, creating a unified, standardized foundation for further development and maintenance of the project.
09 step