Gender Bias in Perception of Human Managers Extends to AI Managers

This study demonstrates that gender biases in leadership perceptions, where male managers are favored and female managers face greater skepticism upon making unfavorable decisions, extend from human leaders to AI managers, highlighting the need to address these prejudices in the design of AI-driven organizational systems.

Hao Cui, Taha Yasseri

Published Wed, 11 Ma

Imagine you are playing a team game with two friends. You all solve puzzles together, and at the end, a "Manager" picks one person to win a small cash prize.

Now, imagine that Manager isn't just a person you know. Sometimes, the Manager is a human. Sometimes, it's an Artificial Intelligence (AI). And sometimes, the Manager is introduced as a man, as a woman, or with no gender given at all.

This paper is like a giant experiment to see how we react when that Manager makes a choice. The researchers wanted to know: Does it matter if the Manager is human or a robot? And does it matter if we think the Manager is a man or a woman?

Here is the story of what they found, explained simply:

1. The Setup: The "Robber" Game

The researchers put 556 people online into teams of three. They played a simple game called "Find the Robber," where you have to spot a bad guy in a cartoon picture.

  • Round 1: Everyone played alone.
  • Round 2: Teams played together.
  • The Twist: A Manager (either Human or AI) was assigned to the team. This Manager was randomly labeled as Male, Female, or Unspecified.
  • The Prize: The Manager picked one "Best Player" to get an extra 50 pence. (In reality, the pick was random, but the players didn't know that).

Afterward, the players were asked: How much do you trust this manager? Are they smart? Are they fair? Would you work with them again?
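The setup above is a randomized 2×3 design: manager type (Human vs. AI) crossed with presented gender (Male, Female, Unspecified), plus a "Best Player" pick that was secretly random. Here is a minimal sketch of that assignment in Python (an illustrative simulation, not the authors' actual code; the function and player names are hypothetical):

```python
import random

# The two experimental factors described in the study.
MANAGER_TYPES = ["Human", "AI"]
MANAGER_GENDERS = ["Male", "Female", "Unspecified"]

def assign_team(players, rng=random):
    """Randomly assign a manager condition to a team and pick a winner.

    Per the study's description, the "Best Player" award was random,
    even though participants believed it reflected performance.
    """
    return {
        "manager_type": rng.choice(MANAGER_TYPES),
        "manager_gender": rng.choice(MANAGER_GENDERS),
        "winner": rng.choice(players),
    }

team = ["Player A", "Player B", "Player C"]
print(assign_team(team))
```

Each team lands in one of six conditions, which is what lets the researchers compare reactions to, say, a losing decision from a "female AI" against the same decision from a "male human".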

2. The Big Surprise: The "Golden Ticket" Effect

The most obvious result was what you'd expect: If you won the prize, you loved the manager. If you lost, you didn't.

  • The Winner's Lens: If you got the award, you thought the manager was a genius, super fair, and someone you'd love to work with again.
  • The Loser's Lens: If you didn't get the award, you thought the manager was unfair and incompetent.

But here is where the gender bias crept in.

3. The Gender Trap: The "Double Standard"

The researchers found that the type of manager (Human vs. AI) and their gender changed how people reacted, especially when things went wrong.

  • The "Male Shield": When a manager was presented as Male (whether human or AI), they were given a lot of grace.

    • If a man won the prize, people thought, "Great choice!"
    • If a man didn't pick you, people were still relatively okay with him. They didn't get as angry. It's like he had a "shield" protecting his reputation.
  • The "Female Glass Ceiling": When a manager was presented as Female, the rules were much harsher.

    • Human Women: If a human woman didn't pick you, you were a bit annoyed, but you understood she might have her reasons.
    • AI Women: This was the worst combination. If the manager was an AI presented as a woman and she didn't pick you, people got very angry. They rated her as the least fair, least competent, and least trustworthy.

Think of it like this:
Imagine a male coach and a female coach. If the male coach picks the wrong player, people say, "He's having a bad day." If the female coach picks the wrong player, people say, "She doesn't know what she's doing."
Now, imagine that female coach is a robot. People seem to think, "Not only is she a woman making a mistake, but she's a robot pretending to be a woman making a mistake." The bias against women gets amplified when they are machines.

4. Why Does This Happen?

The paper suggests that we carry our human prejudices into the digital world.

  • We Project: We treat AI like people. We give them names, voices, and genders.
  • We Stereotype: We have old-school ideas that "Men are leaders" and "Women are helpers." When an AI acts like a leader, we judge it by these old rules.
  • The "Uncanny" Doubt: When a female AI makes a decision that feels unfair, we don't just blame the algorithm; we blame the gender of the algorithm. We are quicker to doubt a female AI than a male AI.

5. The Takeaway for the Future

This study is a warning sign for the future of work. As companies start using AI to hire people, give promotions, or manage teams, they might accidentally create a system that is unfair to women.

If an AI manager is designed to sound or look female, and it makes a tough decision (like firing someone or not giving a bonus), that AI might get blamed much more harshly than a male AI or a human man.

The Bottom Line:
Technology isn't neutral. Even though AI is made of code, the people using it bring their biases with them. If we want AI to be fair, we can't just fix the code; we have to fix our own minds. We need to realize that we are judging a robot's gender just as harshly as we judge a human's.

In short: Whether the boss is a human or a robot, if we think the boss is a woman, we are harder on her when she makes a mistake. And if that boss is a robot woman, we are the hardest of all.