How Predicted Links Influence Network Evolution: Disentangling Choice and Algorithmic Feedback in Dynamic Graphs

This paper proposes a temporal framework based on multivariate Hawkes processes to disentangle intrinsic interaction tendencies from algorithmic feedback in evolving networks, introducing an instantaneous bias measure that effectively captures and characterizes the real-time reinforcement dynamics induced by link prediction models.

Mathilde Perez, Raphaël Romero, Jefrey Lijffijt, Charlotte Laclau

Published 2026-03-05

Imagine you are the mayor of a bustling city where people constantly meet, shake hands, and form friendships. This city is your social network.

For a long time, scientists have studied this city by taking a single photograph (a "snapshot") and counting how many people are friends with others who look like them (same hair color, same job, same neighborhood). They call this homophily (the "birds of a feather" effect).

However, this paper argues that taking a single photo isn't enough. It's like trying to understand a movie by looking at just one frame. To really understand what's happening, you need to watch the whole film.

Here is the story of the paper, broken down into simple concepts:

1. The Two Reasons People Hang Out

The authors say there are two different reasons why people in our city tend to stick to their own groups:

  • Choice Homophily (The "Natural Taste"): This is like you naturally preferring to hang out with people who like the same music or food as you. It's your personal preference. It's who you want to be friends with.
  • Induced Homophily (The "Algorithmic Echo Chamber"): This is the tricky part. Imagine the city has a Mayor's Assistant (the Algorithm) who suggests new friends to everyone. If the Assistant sees that you like Jazz, they might only show you other Jazz fans. Even if you were open to trying Rock music, the Assistant never shows you any Rock fans. Over time, you only meet Jazz fans, not because you hate Rock, but because the Assistant never introduced you to them.

The Problem: When we look at the city, we can't tell the difference between "I only hang out with Jazz fans because I love Jazz" and "I only hang out with Jazz fans because the Assistant hid the Rock fans from me."

2. The Old Way vs. The New Way

  • The Old Way (Static Snapshots): Researchers used to look at the total number of friendships at the end of the year. They would say, "Look! 90% of friendships are within the same group!" But this is misleading. It's like looking at a snowball that has rolled down a hill. You see a giant snowball, but you don't know if it started small and grew huge because of the hill (the algorithm), or if it was already huge.
  • The New Way (The "Instantaneous" View): The authors propose a new tool. Instead of measuring the size of the final snowball, they measure the wind speed right now. They ask: "Is the wind currently pushing people toward their own groups, or is it blowing them apart?"

They call this the "Instantaneous Bias." It's like checking the speedometer of a car right now, rather than just looking at the total distance traveled. This tells you if the algorithm is currently making things more segregated, even if the total number of friends hasn't changed much yet.
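To make the "speedometer vs. odometer" contrast concrete, here is a toy sketch in Python. This is not the paper's actual estimator; the sliding-window approach, the window length, and the edge format are all illustrative assumptions.

```python
# Toy contrast between a cumulative ("total distance traveled") and an
# instantaneous ("speedometer") view of same-group bias, computed over
# a list of timestamped edges. Hypothetical data format:
# (timestamp, edge_id, same_group?)

def cumulative_bias(edges):
    """Fraction of ALL edges so far that are within-group."""
    same = sum(1 for (_, _, same_group) in edges if same_group)
    return same / len(edges)

def instantaneous_bias(edges, t_now, window=10.0):
    """Same fraction, but only over edges in the last `window` time units."""
    recent = [e for e in edges if t_now - e[0] <= window]
    if not recent:
        return 0.0
    same = sum(1 for (_, _, same_group) in recent if same_group)
    return same / len(recent)

# Early edges are mixed; recent edges are all within-group,
# i.e. segregation is accelerating right now.
edges = [(1, "a", False), (2, "b", False), (3, "c", True),
         (45, "d", True), (48, "e", True), (50, "f", True)]

total = cumulative_bias(edges)              # history-averaged view
now = instantaneous_bias(edges, t_now=50)   # what is happening *now*
```

On this toy data the cumulative view reports a moderate bias (4 of 6 edges are within-group), while the instantaneous view reports that every recent edge is within-group: the snapshot undersells what the wind is doing right now.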

3. The "Hawkes" Engine

To measure this wind speed, they use a mathematical engine called a Hawkes Process.

  • Think of it like a campfire: If you throw a log on a fire, it gets hotter. If it gets hotter, it throws off more sparks, which makes the fire even hotter.
  • In our city, if two people from the same group meet, the algorithm gets excited and says, "Great! Let's show more people from this group to each other!" This creates a feedback loop. The Hawkes Process is the math that tracks how one interaction "throws a log" on the fire, making the next interaction more likely.
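The campfire idea can be sketched with the standard univariate Hawkes intensity with an exponential kernel; the parameter values `mu`, `alpha`, and `beta` below are illustrative, not taken from the paper (which uses a multivariate version).

```python
import math

def hawkes_intensity(t, events, mu=0.2, alpha=0.8, beta=1.0):
    """Intensity (event rate) of a univariate Hawkes process at time t.

    mu    : baseline rate (the fire's base heat)
    alpha : jump added by each past event (each log thrown on)
    beta  : exponential decay rate (how fast the fire cools)
    """
    return mu + sum(alpha * math.exp(-beta * (t - ti))
                    for ti in events if ti < t)

# Each past interaction raises the rate of future interactions,
# and the boost fades exponentially as time passes.
quiet = hawkes_intensity(5.0, events=[])            # baseline only
excited = hawkes_intensity(5.0, events=[4.0, 4.5])  # two recent events
```

The self-exciting term is what lets the model separate "this pair interacts a lot on its own" (the baseline `mu`) from "this pair interacts because recent events keep feeding the fire" (the `alpha`/`beta` feedback).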

4. What They Found (The Experiments)

The authors ran simulations (like a video game version of the city) to test different "Mayor's Assistants" (Link Prediction models).

  • The "Standard" Assistant: This assistant just recommends friends based on what has happened before.
    • Result: It creates a snowball effect. It reinforces existing groups. If Group A talks to Group A, the assistant shows them more of Group A. The city becomes very segregated very quickly.
  • The "Fair" Assistant: This assistant tries to mix things up. It forces the system to show people from different groups.
    • Result: It breaks the snowball. The "Instantaneous Bias" meter drops, showing that the wind is now blowing people toward different groups.

The Big Surprise: The authors found that even if a "Fair" assistant looks good on paper (it has a low bias score in a snapshot), if you keep retraining it (updating its brain) without care, it can accidentally make the segregation worse over time because of the feedback loops.
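The two assistants can be caricatured with a deliberately simplified, deterministic toy of the feedback loop. This is not the authors' simulation: `step_standard` and `step_fair` are invented update rules, chosen only to show amplification versus mixing.

```python
# Toy feedback loop over the same-group share of recommendations.
# A "standard" assistant amplifies whatever bias it observes;
# a "fair" assistant mixes a uniform (50/50) exploration term back in.

def step_standard(share, gain=0.1):
    """Amplify the observed same-group share (logistic-style growth)."""
    return min(1.0, share + gain * share * (1.0 - share))

def step_fair(share, gain=0.1, mix=0.5):
    """Blend the amplified share with an unbiased 50/50 recommendation."""
    return (1.0 - mix) * step_standard(share, gain) + mix * 0.5

def run(step, share=0.55, steps=200):
    """Iterate the loop: today's recommendations shape tomorrow's data."""
    for _ in range(steps):
        share = step(share)
    return share

standard_share = run(step_standard)  # snowballs toward full segregation
fair_share = run(step_fair)          # settles near the balanced 50/50 point
```

Starting from a mild 55% same-group share, the standard assistant drifts toward total segregation, while the fair assistant stays close to balance: the snowball only grows if each round of recommendations is trained on the bias the previous round created.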

5. The Real-World Test

They tested their theory on real data from Twitter/X during a German election.

  • They saw that right before the election, the "wind speed" (Instantaneous Bias) spiked. People were suddenly being pushed much harder into their own political bubbles.
  • This proved that their new tool could catch these rapid changes that old "snapshot" tools would have missed.

The Takeaway

This paper is a warning and a guide for the future of AI and social networks.

The Warning: You cannot just look at a network once and say, "This is fair." Algorithms change the network, and the network changes the algorithm. It's a dance, not a photo.

The Guide: To build fair systems, we need to stop looking at the "total distance traveled" (cumulative history) and start watching the "speedometer" (instantaneous dynamics). We need to understand that recommending a friend today changes who you will meet tomorrow.

If we want a diverse city, we can't just hope for it; we have to constantly adjust the "Mayor's Assistant" to ensure the wind keeps blowing in all directions, not just one.