Sept 2018 - Jan 2019
Group project with Rachel Felber (UI/UX), Anna Jensen (Research), Michael Morton (AR/Research) & Dana Ventre (Research)
Duration: 4 months
My role: UX/UI design (mobile), user research, strategy, interviews, AR feature design
Software Used: Sketch, Adobe XD, Unity, InVision
Goal: Help current and would-be coffee drinkers easily find, explore, and purchase artisanal coffee, using traditional mobile features alongside innovative AR features for scanning and interacting with coffee labels and brands.
Process: Agile and comprehensive, starting from our own design challenge, we moved quickly through exploration and ideation, user flow diagrams and personas, paper wireframes and prototypes, user research, interviews, and testing, followed by UI and AR design.
Personal Approach: Focused on finding patterns and needs shared by expert users and beginners alike in order to design key features that would delight all users.
Take a look at our most distinguishing features:
Onboarding Tastes Quiz
Scan and Augmented Reality
Detailed Coffee Profiles
User Research & Exploration
We employed 4 research methods to understand opportunities, current products, users, and pain points:
Stakeholder maps (with Project Brief)
Empathy maps & personas
We interviewed 12 coffee drinkers, ranging from experts to beginners, in a high-end Dallas cafe in order to understand user pain points. We then ideated on how we might most effectively meet these needs within our application. The key pains we identified were: the burden of choice, intimidation in transitioning to more artisanal coffee, and the perception of exploration as expensive and time consuming.
We completed rapid sketches, wireframes, and prototype iterations based on workflows and user stories, adapting our workflows from low-fidelity sketches to high-fidelity wireframes and designs.
We conducted multiple rounds of user interviews and modified each design iteration based on user testing.
For user testing, we conducted 9 interviews in total, each approximately 20 minutes long. We were primarily interested in initial impressions, expectations, and user-driven scenarios. We used black-and-white paper prototypes and recorded our interviews using Ottr.
Results & Next Steps
Below is a summary of our most prevalent findings in order from least to most severe:
Participants didn’t realize the favorite icon (heart) was tappable because it was already filled in with color.
Not everyone realized the flavor tags were tappable.
Few participants understood the meaning of ‘features’ in the filtering section.
Some participants were unsure if the scan feature would automatically take a photo of a coffee bag.
None of the participants understood the purpose of the AR wand icon; everyone thought it was some sort of editing tool.
After incorporating these changes into a new iteration of our prototype, we tested and iterated once more. Finally, we were required to create a 20-minute presentation detailing our process for the rest of the class and staff members. Using Unity, we successfully demonstrated our scan and AR feature, creating that "wow" moment in the audience, a feeling I've been chasing ever since!