Awarded Most Innovative Project - InnoQuest, IU

Async portfolio reviews are broken

TLDR;

problem

The current way of doing async portfolio reviews is broken for feedback seekers (mostly junior designers): they deal with feedback scattered across channels, confusing low-context comments, and the manual work of analyzing feedback from multiple reviewers.

solution

Floop allows visual feedback annotation right on the website, which reduces low-context confusion for feedback seekers. It offers one place to organize and manage feedback, and it encourages actionable feedback through guided questions.

impact

Helps feedback seekers manage and organize feedback 3x more efficiently.

my role

0→1 design of this tool (research + validation + visuals), prototyped and implemented with Lovable.dev AI tools.


Why is it broken?

For feedback seeker

Confusing low-context feedback from reviewers

For feedback seeker

Scattered feedback across different channels

For reviewer

Screenshotting for context is tedious & time-consuming

Hear it from the people

I conducted user interviews with 20+ feedback seekers (early-career designers) and 10+ reviewers.

"

Over text, mentors gave 1–2 lines per question. For visuals, they’d say, 'This isn’t working,' but no screenshots.

~ Design student

"

Notes scatter across physical books/digital apps...I often forget feedback

~ Junior designer (2 years of experience)

Opportunity

An innovative way to do async portfolio reviews

Iteration 1 (with wireframes)

Conceptualized visual annotation & tested it

Feedback annotation solves the issue of location ambiguity in portfolio reviews: reviewers comment directly on specific interface elements, which removes confusion about where the feedback applies and reduces time spent on clarification, creating a more efficient review experience for both reviewers and feedback seekers.
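
For the technically curious, here is a rough sketch of how click-to-annotate could anchor a comment to a specific element on the reviewed page. The names (Annotation, cssPath) and the selector-based anchoring are my assumptions for illustration, not the exact code Lovable generated for Floop.

```ts
// Sketch: anchor a comment to the element the reviewer clicked on.
interface Annotation {
  id: string;
  selector: string;   // CSS path to the annotated element
  offsetX: number;    // click position relative to that element
  offsetY: number;
  comment: string;
  createdAt: string;
}

// Build a simple CSS selector path for the clicked element.
function cssPath(el: Element): string {
  const parts: string[] = [];
  let node: Element | null = el;
  while (node && node !== document.body) {
    const parent = node.parentElement;
    const index = parent ? Array.from(parent.children).indexOf(node) + 1 : 1;
    parts.unshift(`${node.tagName.toLowerCase()}:nth-child(${index})`);
    node = parent;
  }
  return parts.join(" > ");
}

document.addEventListener("click", (e) => {
  const target = e.target as Element;
  const rect = target.getBoundingClientRect();
  const annotation: Annotation = {
    id: crypto.randomUUID(),
    selector: cssPath(target),
    offsetX: e.clientX - rect.left,
    offsetY: e.clientY - rect.top,
    comment: "",                    // filled in by the comment popover
    createdAt: new Date().toISOString(),
  };
  console.log("new annotation anchor", annotation);
});
```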

Dashboard that organizes feedback

In the current system, the feedback seeker has to do all the work of collecting feedback into one doc or Figma file and organizing it; Floop does that automatically.
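
As a rough illustration of what "automatically organizes" means, here is a minimal sketch of grouping every comment by project on the dashboard. FeedbackComment, ReviewSession, and groupByProject are hypothetical names, not Floop's actual data model.

```ts
// Sketch: aggregate all reviewers' comments per project, so the seeker never
// has to collect feedback across docs, chats, and Figma files by hand.
interface FeedbackComment {
  reviewer: string;
  selector: string;      // where on the page the comment is anchored
  text: string;
  createdAt: string;
}

interface ReviewSession {
  projectUrl: string;
  comments: FeedbackComment[];
}

function groupByProject(sessions: ReviewSession[]): Map<string, FeedbackComment[]> {
  const byProject = new Map<string, FeedbackComment[]>();
  for (const s of sessions) {
    byProject.set(s.projectUrl, [...(byProject.get(s.projectUrl) ?? []), ...s.comments]);
  }
  return byProject;
}
```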

Insights from iteration 1 testing

For feedback seeker

There needs to be a way to save comments that resonate with users

For feedback seeker

They want to see if their portfolio link was opened or not

For reviewer

They want to know the goal behind the portfolio review request

Prototyped using Lovable after iteration 1

Iteration 2

Added "guiding questions" to help reviewers

Added "guiding questions" to help reviewers

Added "guiding questions" to help reviewers

Added "guiding questions" to help reviewers

Added "guiding questions" to help reviewers

Added "guiding questions" to help reviewers

This gives reviewers an idea of what the feedback seeker is aiming for when asking for a portfolio review, so that they can provide actionable feedback.

Dashboard designed by Lovable and tweaked for UX by me

I wanted to see how Lovable thinks in terms of design, so I let it design the dashboard and then improved the color, typography, layout, and interactions.

Added "save to implement later" list & link opened status

Added "save to implement later" list & link opened status

Added "save to implement later" list & link opened status

Feedback seekers wanted a single place to store the feedback that resonated with them, making it super easy to search.
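
A minimal sketch of what the saved list and link-opened status could look like as data, assuming hypothetical SavedItem and ShareLink types rather than Floop's real schema.

```ts
// Sketch: "save to implement later" list plus link-opened tracking.
interface SavedItem {
  annotationId: string;  // which comment the seeker saved
  savedAt: string;
}

interface ShareLink {
  url: string;
  openedAt?: string;     // undefined until the reviewer first opens the link
}

function saveForLater(list: SavedItem[], annotationId: string): SavedItem[] {
  // avoid duplicates so the list stays easy to scan
  if (list.some((i) => i.annotationId === annotationId)) return list;
  return [...list, { annotationId, savedAt: new Date().toISOString() }];
}

function markLinkOpened(link: ShareLink): ShareLink {
  // record only the first open, so the seeker sees "opened on <date>"
  return link.openedAt ? link : { ...link, openedAt: new Date().toISOString() };
}
```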

Insights from iteration 2 testing

For feedback seeker

Users want to see each comment with its context in the dashboard

For feedback seeker

When there is too much feedback, it becomes harder to go through all of it

For reviewer

They do all the work, but feel like they are not getting enough value

Iteration 3

Redesigned Dashboard (no AI, just me) with contextual comments

Opening the dashboard should give feedback seekers an at-a-glance overview of their feedback, hence the new dashboard.

Added "AI actionable takeways"

Added "AI actionable takeways"

Added "AI actionable takeways"

This is not an AI summary; summaries always throw a lot of unwanted info at users. AI takeaways instead give them 3 actionable points to improve upon.
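
For context, here is one way such takeaways could be generated with the OpenAI Node SDK. The model, prompt wording, and getTakeaways helper are assumptions for illustration; the production version scaffolded with Lovable isn't shown in this case study.

```ts
// Sketch: turn raw review comments into exactly 3 actionable takeaways.
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function getTakeaways(comments: string[]): Promise<string> {
  const response = await client.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [
      {
        role: "system",
        content:
          "You turn portfolio-review comments into exactly 3 actionable takeaways. " +
          "Do not summarize; each takeaway must start with a verb and be specific.",
      },
      { role: "user", content: "- " + comments.join("\n- ") },
    ],
  });
  return response.choices[0].message.content ?? "";
}
```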

Insights from iteration 3 testing

For feedback seeker

Seekers want to give feedback to their peers/friends

For reviewer

They do all the work, but feel like they are not getting enough value

For the simplicity and length of this case study, I'm skipping the details of iterations 4 & 5.

Ongoing & future work (iteration 6)

For reviewer

Give autonomy to reviewers to create and share packages for portfolio reviews

For reviewer

Provide more qualitative and quantitative value to the reviewers

Floop won Most Innovative Idea at the InnoQuest innovation competition at Indiana University in April 2025.

It was also selected as one of the top 100 projects in Lovable's hackathon.

Self reflection

Prototyping with Lovable was super fun and useful. I can't believe I got the MVP set up in 2 nights and 100 prompts. Being able to take micro ideas and have the AI prototype implement them right away was especially useful.

Another reflection: I learned how to showcase value in terms of outcomes rather than just results. For example, junior designers gain confidence through this tool, not just feedback.

If you are a designer who has feedback on this tool, reach out to me and I would love to talk about it.

Hit me up to talk about this project over a cup of chai!

How did I AI prototype?

How did I reach out to designers?

What inspired you to create this tool?

Copyright © 2025 Dharam Lokhandwala. Last updated on Oct 2025.
