Free Information Disrupts Even Bayesian Crowds

This paper uses an agent-based model to show that unconstrained information exchange can degrade collective belief accuracy, even among idealized, truth-seeking agents. The authors conclude that carefully designed constraints on information flow are necessary for societal communication networks like social media.

Jonas Stein, Shannon Cruz, Davide Grossi, Martina Testori

Published 2026-04-03

The Big Idea: "More Information" Isn't Always "Better"

We live in a world that loves the idea of Free Information. The mantra of the internet age is "Information wants to be free." We believe that if everyone shares everything, everyone will learn the truth faster. We think that in a group of smart, honest people, the more they talk to each other, the smarter the group becomes.

This paper says: Not so fast.

The authors ran a computer simulation to test this. They created a group of "perfect" agents—robots that are perfectly logical, perfectly honest, and perfectly good at math (in technical terms, ideal Bayesian reasoners). They wanted to see if these perfect robots could figure out the truth just by sharing as much information as possible.

The Shocking Result: Even with perfect robots, if they only talk to people who already agree with them, sharing too much information actually makes the group dumber.


The Analogy: The "Echo Chamber" Dinner Party

Imagine a dinner party where everyone is trying to figure out if a new, weird fruit is Sweet (State A) or Sour (State B).

  1. The Setup:

    • There are 100 guests.
    • Each guest has tasted the fruit once. Most tasted it correctly (it's Sweet), but a few unlucky guests tasted it wrong and think it's Sour.
    • The goal is for the whole group to agree that the fruit is Sweet.
  2. The Rules of the Game:

    • Perfect Logic: Everyone is a math genius. When they hear a new piece of evidence, they update their opinion exactly as Bayes' rule prescribes.
    • Total Honesty: Everyone shares their best evidence. They don't lie.
    • The "Homophily" Rule (The Problem): People naturally prefer to sit next to and talk to people who already agree with them. If you think the fruit is Sweet, you mostly talk to other "Sweet" people. If you think it's Sour, you mostly talk to other "Sour" people.
  3. The Experiment:
    The researchers tested two scenarios (a toy code sketch of the whole setup follows this list):

    • Scenario A (Low Capacity): Guests can only whisper one piece of evidence to their neighbor.
    • Scenario B (High Capacity): Guests can shout ten pieces of evidence to their neighbor.
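
To make the moving parts concrete, here is a minimal Python sketch of this kind of dinner party. It is a toy model in the spirit of the paper, not the authors' code: the function name, the parameter values (signal accuracy 0.7, homophily 0.9, and so on), and the naive log-odds bookkeeping are all illustrative assumptions.

```python
import math
import random

def simulate(n_agents=100, accuracy=0.7, homophily=0.9,
             capacity=1, rounds=50, seed=0):
    """Toy echo-chamber dinner party. The true state is "Sweet"."""
    rng = random.Random(seed)
    # Log-likelihood ratio carried by a single taste of the fruit.
    llr = math.log(accuracy / (1 - accuracy))

    # Each guest starts with one private, noisy taste:
    # +1 points to the truth ("Sweet"), -1 points the wrong way ("Sour").
    evidence = [[1 if rng.random() < accuracy else -1]
                for _ in range(n_agents)]
    # Belief in log-odds space: > 0 means "I think it's Sweet".
    belief = [e[0] * llr for e in evidence]

    for _ in range(rounds):
        for i in range(n_agents):
            agree = [j for j in range(n_agents)
                     if j != i and (belief[j] > 0) == (belief[i] > 0)]
            disagree = [j for j in range(n_agents)
                        if j != i and (belief[j] > 0) != (belief[i] > 0)]
            # Homophily: usually talk to someone who already agrees.
            if agree and (not disagree or rng.random() < homophily):
                partner = rng.choice(agree)
            else:
                partner = rng.choice(disagree)
            # The partner shares up to `capacity` items from their pool.
            shared = rng.sample(evidence[partner],
                                min(capacity, len(evidence[partner])))
            for item in shared:
                evidence[i].append(item)  # evidence gets recycled...
                belief[i] += item * llr   # ...and naively recounted

    # Fraction of guests who end up believing the truth.
    return sum(b > 0 for b in belief) / n_agents
```

The one deliberate simplification to notice: guests cannot tell fresh evidence from recycled evidence, so the same unlucky taste can be counted again every time it is passed along.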

What Happened?

1. When Everyone Mixes (Low Homophily)

If the guests are forced to sit with strangers, Scenario B (High Capacity) wins. The "Sour" people hear ten great arguments from the "Sweet" people, realize they were wrong, and the whole group agrees the fruit is Sweet. More info = Better truth.
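
Why does more evidence help here? It is plain Bayes' rule: each honest, independent report multiplies a listener's posterior odds by that report's likelihood ratio,

$$
\frac{P(\text{Sweet} \mid e_1, \dots, e_k)}{P(\text{Sour} \mid e_1, \dots, e_k)}
= \frac{P(\text{Sweet})}{P(\text{Sour})}
\prod_{i=1}^{k} \frac{P(e_i \mid \text{Sweet})}{P(e_i \mid \text{Sour})}.
$$

As long as the reports really are independent and mostly accurate, a larger k pushes the odds toward the truth quickly, which is why high capacity wins in a well-mixed room.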

2. When People Stick to Their Own Kind (High Homophily)

This is where it gets weird.

  • The "Sour" Group: The few people who think the fruit is Sour sit together. They are all wrong, but they are confidently wrong.
  • The High Capacity Trap: Because they can share ten pieces of evidence at once, they bombard each other with "proof" that the fruit is Sour. Even though their evidence is weak or accidental, hearing it ten times from ten different people makes them all but certain they are right.
  • The Result: The "Sour" group becomes a super-solid, unshakeable block of incorrect belief. They never hear the "Sweet" truth because they aren't talking to the "Sweet" people, and the "Sweet" people are too busy talking to other "Sweet" people to break through.

The Paradox: By giving the "Sour" group the ability to share more information, you accidentally gave them the tools to dig their own hole deeper. They reinforced their mistake so strongly that they became impossible to correct.
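
To see how fast the hole gets dug, plug some illustrative numbers (my own, not figures from the paper) into the same odds formula. Suppose each shared "Sour" report carries 3-to-1 odds in favor of Sour, and a guest hears ten of them while treating them all as independent:

$$
\underbrace{1}_{\text{prior odds}} \times 3^{10} = 59{,}049
\quad \Longrightarrow \quad
P(\text{Sour}) = \frac{59049}{59050} \approx 99.998\%.
$$

The catch is that those ten reports are not independent at all; they are largely recycled copies of the same few unlucky tastes. A guest who cannot trace where evidence came from ends up double-counting it, and the cluster talks itself into near-certainty.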

The Takeaway: The "Diet" of Information

The paper argues that unlimited information flow can be dangerous in a world where people naturally cluster with like-minded others, a tendency that social media algorithms reinforce.

  • The Old Way: "Let everyone say everything! More data = More truth!"
  • The New Insight: Sometimes, limiting how much information people can share in a single conversation can actually help.

If the "Sour" group could only share one piece of evidence at a time, they might not be able to convince each other so strongly. They might leave a little room for doubt, or they might eventually hear a single, strong counter-argument from a "Sweet" person that breaks the spell.
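
You can poke at this directly with the toy simulation sketched earlier (again under illustrative parameters, not the paper's):

```python
# Compare low vs. high bandwidth in a heavily clustered crowd.
for cap in (1, 10):
    correct = simulate(homophily=0.9, capacity=cap, seed=42)
    print(f"capacity={cap:2d}: {correct:.0%} of guests end up correct")
```

Exact numbers will vary with the seed and parameters; the pattern to look for is whether the high-capacity run leaves a larger, more entrenched wrong minority than the low-capacity run does.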

Why This Matters for Real Life

We often think that social media algorithms should just show us everything so we can make the best decisions. But this paper suggests that if we are all in our own "echo chambers" (talking only to people like us), flooding those chambers with endless content might just make us more stubborn and wrong.

It suggests that designers of social networks (and teachers, doctors, and leaders) shouldn't just aim for "maximum sharing." Sometimes, curating or limiting the flow of information can actually help a group reach the truth faster and avoid getting stuck in a polarized, incorrect belief.

In short: Sometimes, a little silence is better than a lot of noise, especially if everyone in the room is already shouting the same wrong thing.
