Evaluating Demographic Misrepresentation in Image-to-Image Portrait Editing

This paper investigates and quantifies demographic bias in instruction-guided image-to-image portrait editing. It shows that identical editing prompts cause systematic identity-preservation failures and stereotype reinforcement across demographic groups, and that these effects can be mitigated with prompt-level identity constraints, without retraining the models.

Huichan Seo, Minki Hong, Sieun Choi, Jihie Kim, Jean Oh

Published 2026-02-19

Imagine you have a magical photo-editing app. You upload a picture of yourself, type in "Make me look like a CEO," and hit enter. You expect the app to put you in a suit and give you a confident pose, but keep you—your face, your skin tone, your features—exactly the same.

This paper is a report card on how well these "magic apps" actually do that job. The researchers found that while the apps are great at following instructions, they have a hidden, unfair habit: they treat people differently based on who they look like.

Here is the breakdown of what they discovered, using some everyday analogies.

1. The Two Ways the Magic Goes Wrong

The researchers found that when you ask the app to edit a photo, it fails in two specific ways, especially for people of color, women, or older adults.

  • The "Ghost Edit" (Soft Erasure):
    Imagine you ask a chef to add extra salt to your soup. The chef brings the soup back, but it tastes exactly the same as before. They didn't refuse to do it; they just quietly ignored your request.

    • In the app: You ask to "show the person in a wheelchair," but the app just gives you the same standing photo. It pretends to edit, but the requested change is silently erased.
  • The "Hollywood Remake" (Stereotype Replacement):
    Imagine you ask a director to cast a specific actor in a movie. The director agrees, but then swaps the actor for a different one who fits a "typical" stereotype of that role.

    • In the app: You ask to make a Black woman look like a "CEO." The app puts her in a suit, but it also lightens her skin, straightens her hair, and changes her facial features to look more like a white person. It didn't just edit the clothes; it edited her identity to fit a stereotype.

2. The "Default Setting" Problem

The study tested three popular photo-editing models. The researchers used a library of 84 different faces, spanning a range of races, genders, and ages, and asked each model to perform 20 different edits on every face (like "look like a doctor" or "look older").
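The scale of that test is easier to see as a simple grid: every model is asked to apply every instruction to every face. A minimal sketch of the combinations (the model names, face IDs, and instruction labels here are illustrative placeholders, not the paper's actual lists):

```python
from itertools import product

# Illustrative stand-ins for the paper's actual models, faces, and prompts.
models = ["model_a", "model_b", "model_c"]                  # three editing models
faces = [f"face_{i:02d}" for i in range(84)]                # 84 portrait photos
instructions = [f"instruction_{i:02d}" for i in range(20)]  # 20 edit prompts

# Every (model, face, instruction) combination is one edited image to grade.
test_cases = list(product(models, faces, instructions))
print(len(test_cases))  # 3 * 84 * 20 = 5040
```

That cross-product is why the researchers end up with thousands of edited images to evaluate, which in turn is why they automate the grading (see section 5).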

The Big Discovery:
The apps have a "Default Setting" that assumes the "standard" human looks white, young, and male.

  • If you are a white person, the app mostly keeps you looking like you.
  • If you are a Black, Indian, or Latino person, the app often tries to "fix" you by making your skin lighter and your features more "European."

It's like a photocopier that has a "Whiteness" filter turned on by default. If you put a dark photo in, the machine automatically tries to make it lighter, even if you didn't ask for that.

3. The "Occupation Trap"

The researchers also tested how the apps handle jobs.

  • The Test: They asked the app to show a "CEO" or a "Nurse" without specifying a gender.
  • The Result: Even if they started with a photo of a woman, the app often turned the "CEO" into a man. Even if they started with a man, the app turned the "Nurse" into a woman.
  • The Metaphor: The app is like a person who has only ever seen movies where CEOs are men and nurses are women. When you ask for a CEO, their brain automatically pictures a man, regardless of the photo you gave them.

4. The "Magic Spell" Fix (Prompt-Level Control)

The researchers didn't just point out the problem; they tried a clever workaround. They realized they couldn't easily re-train the massive AI models (which is like trying to rebuild the engine of a car while driving it).

Instead, they tried adding a "Magic Spell" to the instructions.

  • Normal Instruction: "Make this person look like a CEO."
  • Magic Spell Instruction: "Make this person look like a CEO, BUT keep their deep brown skin, round face, and braided hair exactly the same."
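
The "Magic Spell" is just string construction: take the original instruction and append an explicit description of the features to keep. A toy sketch (the clause wording and helper name are illustrative, not the paper's exact template):

```python
def add_identity_constraint(instruction: str, features: list[str]) -> str:
    """Append an explicit identity-preservation clause to an edit prompt.

    `features` is a per-person description (skin tone, face shape, hair, etc.).
    The clause wording here is an illustrative example, not the paper's template.
    """
    clause = ", BUT keep their " + ", ".join(features) + " exactly the same"
    return instruction + clause

prompt = add_identity_constraint(
    "Make this person look like a CEO",
    ["deep brown skin", "round face", "braided hair"],
)
print(prompt)
# Make this person look like a CEO, BUT keep their deep brown skin, round face, braided hair exactly the same
```

The point is that the fix lives entirely in the prompt: no model weights change, so anyone can apply it today.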

Did it work?
Yes! For people of color, adding this detailed description of their physical features stopped the app from changing their race or skin tone.

  • The Catch: It worked much better for non-white people than for white people. This proved that the app's "Default Setting" was indeed biased toward whiteness. When you explicitly told the app "Don't change the skin tone," it stopped trying to make everyone look white.

5. Why This Matters

The authors used a "Vision-Language Model" (a super-smart AI that can look at pictures and grade them) to check thousands of images. They found that these AI editors are currently unfair.
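The grading step boils down to counting, per demographic group, how often the judge says identity was preserved. A toy sketch with mock verdicts (in the real pipeline a vision-language model produces these labels per image; the groups and numbers below are invented for illustration):

```python
from collections import defaultdict

# Mock judge verdicts: (demographic group, did the edit preserve identity?)
verdicts = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
]

totals = defaultdict(lambda: [0, 0])  # group -> [preserved_count, total_count]
for group, preserved in verdicts:
    totals[group][0] += int(preserved)
    totals[group][1] += 1

for group, (kept, total) in totals.items():
    print(f"{group}: identity preserved in {kept}/{total} edits")
```

Comparing these per-group rates is what lets the authors call an editor "unfair": a fair editor would preserve identity at roughly the same rate for every group.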

  • Trust: If you use these tools for your profile picture or advertising, you can't trust that the result will actually look like you.
  • Harm: These tools reinforce harmful stereotypes (e.g., "Doctors look like white men," "Black people shouldn't be in wheelchairs").

The Bottom Line

Current AI photo editors are like a well-meaning but biased assistant. They are eager to help, but they have a hidden bias that tries to make everyone look like a "standard" white person.

The good news is that we can fix this right now without rebuilding the AI. We just need to be very specific in our instructions, telling the AI exactly what features to keep. However, the researchers argue that the real responsibility lies with the companies building these tools: they need to build the "fairness" into the engine itself, so users don't have to constantly fight the machine to keep their own identity.
