LikeThis! Empowering App Users to Submit UI Improvement Suggestions Instead of Complaints

This paper presents LikeThis!, a GenAI-based approach that helps users turn vague UI complaints into constructive, actionable feedback by generating concrete design alternatives from their comments and screenshots. The approach was validated through model benchmarking and a user study, which showed improved feedback quality and developer understanding.

Jialiang Wei, Ali Ebrahimi Pourasad, Walid Maalej

Published 2026-03-05
📖 5 min read · 🧠 Deep dive

Imagine you're using a new app, and you see a button that's way too small, or text that's impossible to read. You want to tell the developer, "Hey, fix this!" But here's the problem: you're not a designer. You don't know the right words to describe how to fix it. You might just say, "This is bad," or "I hate this button."

To the developer, that's like getting a note that says "The house is broken" without knowing if the roof is leaking, the door is stuck, or the lights are out. It's frustrating for both sides.

This paper introduces a new tool called LikeThis! (yes, with an exclamation mark!) that acts as a universal translator between confused users and busy developers. Instead of just complaining, it helps you say, "I like it like this," by showing you exactly what the fix could look like.

Here is how it works, broken down into simple concepts:

1. The Problem: The "Vague Complaint" Trap

Usually, when something in the UI bothers you, you leave a text review. But text is often vague.

  • You say: "The font is too small."
  • Developer hears: "Make the font bigger? On which screen? By how much? Should I change the color too?"
  • Result: The developer is stuck guessing, or they ignore the feedback because it's too hard to act on.

2. The Solution: LikeThis! (The "Magic Mirror")

The researchers built an app feature powered by Generative AI (the same kind of tech that makes funny pictures from text). Here is the workflow:

  1. Snap & Speak: You take a screenshot of the problem and type a quick note (e.g., "I accidentally hit the call button while typing my number").
  2. The AI Brainstorm: The AI doesn't just read your note; it acts like a creative architect. It looks at your screenshot and your note, then instantly generates three different visual solutions.
    • Option A: Add a "Confirm Call?" pop-up window.
    • Option B: Change the button to a "Slide to Call" slider.
    • Option C: Make the button smaller and move it away from the number pad.
  3. You Pick: You look at the three new designs and say, "Oh, I like Option B! That's exactly what I wanted."
  4. The Handoff: You send that specific picture to the developer.

Now, instead of a vague complaint, the developer receives a clear blueprint: "The user wants a slide-to-call button here. Here is a picture of how it should look."
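The workflow above can be sketched in a few lines of code. Everything here is a hypothetical illustration: the function names, the `Suggestion` type, and the hard-coded ideas are stand-ins for what a real multimodal GenAI backend would produce, not the authors' actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Suggestion:
    title: str
    description: str
    mockup: bytes  # rendered image of the proposed redesign

def generate_alternatives(screenshot: bytes, note: str, n: int = 3) -> list[Suggestion]:
    """The 'AI Brainstorm' step: one complaint in, several visual fixes out."""
    # Stubbed ideas; a real system would query a vision-capable model here.
    ideas = [
        ("Confirmation dialog", "Add a 'Confirm Call?' pop-up before dialing."),
        ("Slide to call", "Replace the button with a 'Slide to Call' slider."),
        ("Relocate button", "Shrink the button and move it off the number pad."),
    ]
    return [Suggestion(t, d, b"<mockup-image>") for t, d in ideas[:n]]

def submit_feedback(screenshot: bytes, note: str) -> Suggestion:
    options = generate_alternatives(screenshot, note)
    # "You Pick": in the real app the user chooses; here we hard-pick option B.
    chosen = options[1]
    # "The Handoff": the chosen mockup (plus the note) goes to the developer.
    return chosen

if __name__ == "__main__":
    picked = submit_feedback(b"<screenshot>", "I accidentally hit the call button")
    print(picked.title)  # Slide to call
```

The key design point is that the user never writes a design spec; they only choose among concrete pictures, and the chosen picture itself becomes the spec.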

3. The "Secret Sauce": Two Steps, Not One

The researchers found that if you just ask the AI to "fix the picture," it often makes a mess (like a painter trying to fix a wall without knowing what color the paint should be).

So, they added a middle step: The "Suggestion Generation" phase.

  • Step 1: The AI first writes a text plan (e.g., "Change the button to a slider to prevent accidental clicks").
  • Step 2: Then it uses that plan to draw the new picture.

Think of it like building a house. You don't just tell the construction crew, "Build a house." You first give them the blueprints (the text plan), and then they build the house (the image). The study showed that this two-step process made the final pictures much better and more accurate.
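The two-step split can be sketched as two separate function calls, with the text plan explicitly passed between them. Again, these names and stubs are illustrative assumptions, not the paper's code; a real pipeline would call a vision-language model in step 1 and an image-editing model in step 2.

```python
def generate_suggestion(screenshot: bytes, note: str) -> str:
    """Step 1: turn the raw complaint into an explicit textual design plan."""
    # Stub: a real call would send screenshot + note to a multimodal model.
    return "Change the call button to a slider to prevent accidental clicks."

def render_mockup(screenshot: bytes, plan: str) -> bytes:
    """Step 2: edit the screenshot according to the *plan*, not the raw note."""
    # Stub: a real call would pass `plan` as the editing instruction to an
    # image model (the paper's best performer was GPT-Image-1).
    return b"<edited-screenshot:" + plan.encode() + b">"

def two_step_fix(screenshot: bytes, note: str) -> bytes:
    plan = generate_suggestion(screenshot, note)  # the blueprint first...
    return render_mockup(screenshot, plan)        # ...then the picture
```

Keeping the plan as an intermediate artifact has a side benefit: it is human-readable, so both the user and the developer can see what change the mockup is supposed to show.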

4. The Results: Does It Actually Work?

The team tested this in two ways:

  • The Robot Test: They fed the AI 300 real app screenshots with known problems. They asked different AI models to fix them. The winner was GPT-Image-1, which was much better at fixing the problem without breaking other parts of the screen (like a surgeon fixing a heart without cutting the wrong artery).
  • The Human Test: They gave 15 regular people (like dentists, teachers, and lifeguards) 10 real apps to use.
    • Users loved it: in 85% of cases, users rated the AI's suggestions as accurately capturing what they were trying to say. They felt heard.
    • Developers loved it: When developers saw the feedback with the AI-generated pictures, they said, "Aha! Now I know exactly what to do!" The feedback went from "somewhat understandable" to "clearly actionable."
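The "Robot Test" boils down to a scoring loop over screenshot/issue pairs. The sketch below is a guess at its shape: the dataset, judging functions, and toy model are all assumptions for illustration, since the paper's actual benchmark harness (and its human or automated judges) is not shown here.

```python
def fixes_issue(edited: bytes, known_issue: str) -> bool:
    # Stub judge: a real benchmark would use human raters or a model-based judge
    # to decide whether the known problem is actually resolved in the edit.
    return known_issue.encode() in edited

def preserves_rest(edited: bytes) -> bool:
    # Stub: a real check would compare the untouched regions of the screenshot
    # (the "surgeon" criterion: fix the heart, don't cut the wrong artery).
    return edited.startswith(b"<fixed:")

def score_model(generate_fix, dataset) -> float:
    """Fraction of cases where the issue is fixed AND the rest of the UI survives."""
    hits = 0
    for screenshot, issue in dataset:
        edited = generate_fix(screenshot, issue)
        if fixes_issue(edited, issue) and preserves_rest(edited):
            hits += 1
    return hits / len(dataset)

# Toy stand-in for a model under test.
mock_model = lambda shot, issue: b"<fixed:" + issue.encode() + b">"

dataset = [(b"<shot1>", "tiny-button"), (b"<shot2>", "low-contrast-text")]
print(score_model(mock_model, dataset))  # 1.0 for this toy model
```

The two-part criterion matters: a model that "fixes" the button by redrawing the whole screen would pass the first check but fail the second.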

5. The Catch (and the Future)

It's not perfect yet.

  • Speed: Generating these pictures takes about a minute. In a world where we want instant answers, that feels like an eternity.
  • Context: The AI fixes one screen at a time. It might not realize that changing a button here breaks the flow of the whole app later.
  • Trust: Users need to trust that the AI isn't hallucinating weird designs.

The Big Picture:
This paper suggests a future where you don't need to be a designer to design. You just need to know what you don't like, and the AI will help you visualize what you do like. It turns "complaining" into "collaborating," making apps better for everyone, faster.

In a nutshell: LikeThis! is the "Show, Don't Tell" button for app users. It turns your frustration into a clear, visual instruction manual for the developers.