Recip.ai

Recip.ai is an AI-informed recipe generation mobile app concept designed to help home cooks use up excess ingredients before they spoil. After an initial 3-week team design sprint, we had a strong concept, but the designs had usability issues that hurt user satisfaction and engagement in testing.

My goal was to test, evaluate, and iterate upon the original designs to provide a seamless, intuitive, and delightful experience for users, leading to improved satisfaction and higher conversion rates.

Team (4 UX Designers)

Xavier Talwatte, Priscilla Huff, Shaina Prasad, Dillon Marks

Role

UX/UI Designer

Timeline

3-week team design sprint
3-week individual iterations

Challenge

Recip.ai had a complicated pantry interface inspired by competing products. This original design made the app difficult to use on the go, and the user flow was confusing and time-consuming. The app also had several technical feasibility issues that made it impossible for developers to build as designed.

Results

The redesigned app features a clean, clutter-free interface and a streamlined user flow, making it easier for users to navigate and access essential features.

The improved interface, informed by user research, resulted in a 166% increase in conversion rate.

A revamped ingredient selection flow made the app easier and more convenient to use, reducing total cycle time by 55% and improving user satisfaction by 79.6%.

+166%

Improved Conversion Rate

55%

Reduced Total Cycle Time

+79.6%

Improved User Satisfaction

Process: Original Designs

Problem Exploration:

We conducted secondary research, user interviews, and surveys to understand user needs, pain points, and current solutions. We also studied competitor apps and industry trends to gather insights.

How might we empower home cooks to make use of their excess ingredients before they spoil?

Key Findings:

  • Users don't always have enough time or patience to find recipes online.

  • Coming up with creative ways to use excess ingredients is difficult.

  • Users don't want to keep an inventory of the items they have at home.

  • Most home cooks use the internet as a part of their meal-planning process.

Ideation:

Based on our research findings, we used brainstorming activities like Crazy Eights and How Might We to ensure we had a wealth of ideas to pull from, prioritizing features according to user needs.

Creating Crazy Eights sketches helped us keep an open mind while ideating potential solutions. Initial ideas were intentionally unrefined to encourage creativity.

I created paper sketches with multiple versions of key screens to quickly generate, communicate, and refine ideas to share with my team.

Design, Refine, Combine:

We each designed low-fidelity wireframes to visualize four different approaches to a single flow, then combined features from each via dot voting.

Establishing Success Criteria:

To evaluate the performance of our designs, we established KPIs as success criteria.

  • User satisfaction

  • Total cycle time

  • Conversion rate

We selected these KPIs to make sure our product would be useful and convenient for users, and ultimately an app people would actually use.

Usability Testing:

We conducted usability tests to identify areas for improvement. Over two rounds of remote, moderated usability testing, we simplified login options, reducing error rates by 75%. We also addressed several P0 (highest-priority) insights and saw a 25% decrease in reported confusion related to the pantry page.

Outcomes:

We developed a refined low-fidelity prototype for our concept that showed promise in terms of desirability; many usability study participants liked the idea and shared they would be likely to use the app if convenience and ease of use were improved. However, a complicated pantry interface and unclear user flow made the app inefficient and confusing.

The original flow featured 6 key screens where a user could add items to a digital pantry and receive recipe recommendations based on the ingredients they had at home. If any ingredients were missing, they could generate a shopping list for them.

Process: Revised Designs

KPI Assessment:

After concluding the 3-week design sprint, I evaluated the performance of our designs by quickly conducting usability tests with 11 participants and identified key areas for improvement:

  • The user flow was overly complicated, resulting in long total cycle times (average: 2:25).

  • Users didn't like the pantry page, resulting in lower satisfaction with the product (49% satisfaction).

  • Conversions struggled because of convenience issues, the prototype's low fidelity, and confusion (27% conversion rate).

Revisiting Competitor Analysis:

Our original competitor analysis focused on what we could learn from the successes and failures of competing products, but after the KPI assessment, I had new doubts about some competitor features and wanted to re-evaluate them.

Key Findings:

  • Recipe-finder products were heavily reliant upon databases and struggled to find recipes as users added more ingredients.

  • Using AI tools to manually generate recipes is effective but time-consuming. Precise prompts are required for adequate information, and their interfaces are not user-friendly.

  • Non-AI competitors fail the Katta Sambol Test.

The Katta Sambol Test:

One of the unique weaknesses of recipe-finder products is that if they don't have your ingredient in their database, you're toast (pun intended). To stress-test these competing products, I went looking for a unique ingredient in my parents' fridge. Katta Sambol, a Sri Lankan condiment with a spicy, tangy flavor, seemed like the perfect test. This test helped validate Recip.ai as a concept: seeing it succeed where others failed assured me that it was on the right track.

Katta Sambol!

Key recipe finder competitors (Tasty, SuperCook, and BigOven) all failed the Katta Sambol Test.

Key Takeaways:

  • Validation of concept and unique value proposition.

  • Competitors struggle with unique ingredients.

  • Recip.ai would not rely on a database for recipes, so some features couldn't be built the same way competitors built theirs.

Challenge 1:

Initially, we assumed that users would be at home while using the app. This assumption is evident in the layout of the pantry page (inspired by competing products), which prompts users to input all of their available ingredients. In reality, users won't always be at home to check their available ingredients.

I tackled this problem by stepping away from my fridge (and computer screen) to visit a local park. There, I interviewed 5 people to better understand how users plan dinner when away from home.

"It's usually 3 ingredients for me: vegetables, carbs, and protein."

-Anonymous interviewee from the park

People only remembered 2-4 of the ingredients they had at home. These findings are consistent with Miller's law, which states that most people can hold only about seven (plus or minus two) items in working memory, so expecting users to recall an entire pantry is unrealistic.

"It's usually 3 ingredients for me: vegetables, carbs, and protein."

-Anonymous interviewee from the park


People only remembered 2-4 ingredients they had at home. These findings are reinforced by Miller's law, which states that most people can only keep seven (plus or minus two) items in their working memory.

"It's usually 3 ingredients for me: vegetables, carbs, and protein."

-Anonymous interviewee from the park


People only remembered 2-4 ingredients they had at home. These findings are reinforced by Miller's law, which states that most people can only keep seven (plus or minus two) items in their working memory.

Original designs prompted users to enter all of their ingredients from home to create a digital pantry, resulting in:

  • A confusing user flow

  • Longer task completion times

  • Usability issues away from home

Before

Revised designs informed by interview insights feature several text input fields for adding ingredients. These changes resulted in:

  • An intuitive user flow

  • Faster task completion times

  • An interface optimized for users on the go

After

Challenge 2:

Our recipe card design was based on the assumption that we would have a database of images to pull from. However, Recip.ai would not work this way, since it generates recipes on the spot.

I tackled this problem by interviewing people at a local 10K race to understand what information they prioritize when deciding which recipe to cook:

  • Preparation Time

  • Difficulty Level

  • Calorie Count

Card Anatomy

The recipe card component contains seven elements within four sections.
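
To make the card anatomy concrete, here is a minimal, hypothetical sketch of the data the revised card surfaces. Only the four pieces of information it shows (user-entered ingredients, prep time, difficulty, and calorie count) come from this case study; the field names, types, the title field, and the sample values are illustrative assumptions, not the team's actual specification.

// Hypothetical TypeScript data model for the revised recipe card.
// The four surfaced fields mirror the interview findings; everything
// else (names, types, the title, the sample values) is assumed.
type Difficulty = "easy" | "medium" | "hard";

interface RecipeCard {
  title: string;            // assumed: name of the AI-generated recipe
  ingredients: string[];    // taken directly from the user's text input
  prepTimeMinutes: number;  // preparation time shown on the card
  difficulty: Difficulty;   // difficulty level shown on the card
  calories: number;         // calorie count shown on the card
}

// Example: a card generated on the spot from user input, with no
// database image required (the original feasibility problem).
const exampleCard: RecipeCard = {
  title: "Katta Sambol Fried Rice",
  ingredients: ["katta sambol", "rice", "egg"],
  prepTimeMinutes: 20,
  difficulty: "easy",
  calories: 450,
};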

Before

Original designs struggled with feasibility due to a reliance on recipe images presumably sourced from a database.

After

Revised designs informed by interview insights resolve feasibility issues and feature key information users want to know:

  • Ingredients from user input

  • Prep time

  • Difficulty level

  • Calorie count

Visual Design:

I created a style guide to ensure consistency and intentionality behind each visual design decision.

Final designs follow an eight-point grid with consistent spacing rules throughout.

Typography guidelines inform hierarchy, clarity, and cohesion.

The color palette assigns a clear, specific use to each color.

Final Designs:

“I would definitely use this. I've been wanting something like this to exist for years.”

-Ano Nymous (Usability Test Participant)

Conclusion

The Recip.ai app redesign successfully addressed existing usability issues, resulting in a more intuitive and user-friendly experience. The improved designs produced the following KPI changes (the arithmetic behind each figure follows the list):

  • Conversion rate increased from 27% to 72%, a 166% improvement.

  • Total cycle time decreased from an average of 2:25 to 1:05, reducing completion time by 55%.

  • Average satisfaction scores increased from 49% to 88%, a 79.6% improvement.
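
Each percentage is simply the relative change from the sprint baseline, (after - before) / before, with the cycle times converted to seconds (2:25 = 145 s, 1:05 = 65 s):

\[
\frac{72 - 27}{27} = \frac{45}{27} \approx 1.67 \quad (\text{the reported } {+}166\% \text{ gain})
\]
\[
\frac{145 - 65}{145} = \frac{80}{145} \approx 0.55 \quad (\text{the } 55\% \text{ cycle-time reduction})
\]
\[
\frac{88 - 49}{49} = \frac{39}{49} \approx 0.796 \quad (\text{the } {+}79.6\% \text{ satisfaction gain})
\]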

While I'm proud of these improvements, there is always room to grow, and if I were to do this project over, there are a few things I'd do differently:

  1. Talk to users sooner. I didn't understand why the original pantry design was confusing participants. Talking with prospective users to better understand their processes for planning and preparing meals likely would have led me right to the issue.

  2. Be more critical of competitors. I was originally enamored with aspects of competing products that I saw as robust features; however, looking back, these features were designed to solve different problems and were counterproductive to Recip.ai's goals.

  3. Acknowledge the impact of constraints. If I had the opportunity, I would use additional testing methods to understand issues I didn't attempt to solve in this project. One issue participants shared was a lack of trust in AI. Some also shared doubts about the quality of AI-generated recipes. Determining how to increase confidence related to these issues would certainly have strengthened this project.

Thank you for taking the time to read this case study!

Xavier Talwatte | UX Designer
