Serial dependence generalizes across the senses

Through a series of experiments combining behavioral and EEG data, this study demonstrates that serial dependence in numerosity perception generalizes across vision and audition via a functional, mid-level multisensory network, thereby challenging existing low-level and high-level accounts of the phenomenon.

Original authors: Fornaciai, M., Togoli, I., Binisti, S., Collignon, O.

Published 2026-03-17

This is an AI-generated explanation of a preprint that has not been peer-reviewed. It is not medical advice. Do not make health decisions based on this content.

Imagine your brain is like a very busy, slightly forgetful artist trying to paint a picture of the world in real time. Sometimes, this artist gets a little "sticky." If they just painted a blue sky, they might accidentally make the next sky look a little bluer than it actually is. If they just heard a fast drumbeat, the next drumbeat might sound a tiny bit faster.

This "stickiness" is called Serial Dependence. It's a quirk of our brains where what we just experienced pulls our current perception toward it, making things look or sound more similar to the past than they really are.
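The pull toward the past can be captured in a toy model (not from the paper; the pull strength and noise level here are illustrative assumptions): the reported value is the true value plus noise, nudged a fraction of the way toward the previous stimulus.

```python
import random

def perceive(true_value, previous_stimulus, pull=0.15, noise=0.5):
    """Toy serial-dependence model: the percept is nudged a
    fraction `pull` of the way toward the previous stimulus,
    plus some random perceptual noise."""
    percept = true_value + pull * (previous_stimulus - true_value)
    return percept + random.gauss(0, noise)

random.seed(0)
# Judge a sequence of 10 dots after seeing 20 dots vs. after seeing 5 dots.
after_many = sum(perceive(10, 20) for _ in range(10_000)) / 10_000
after_few = sum(perceive(10, 5) for _ in range(10_000)) / 10_000
print(after_many, after_few)  # the first average is pulled up, the second down
```

Averaged over many trials, the same 10-dot display is judged as containing more dots when it follows a large set and fewer when it follows a small one, which is exactly the attractive bias the experiments below measure.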

For a long time, scientists argued about why this happens. They had two main theories:

  1. The "Low-Level" Theory: This suggests the stickiness happens inside each sense separately. Like, your eyes have their own sticky glue, and your ears have their own, and they never talk to each other.
  2. The "High-Level" Theory: This suggests the stickiness happens after you've seen or heard something, when your brain is making a decision. It's like your brain's "decision committee" getting confused by the last meeting and carrying that confusion into the next one.

The Big Question: Can the "stickiness" jump from one sense to another? If you just saw a burst of many flashes, will the stream of beeps you hear next seem to contain more beeps than it really does?

This paper says: Yes, it can! But there's a catch: Attention is the gatekeeper.

Here is the story of their three experiments, explained simply:

Experiment 1: The "Sticky" Senses

The researchers set up a game. Participants had to guess how many things they saw or heard.

  • The Setup: First, a "distractor" appeared (either a flash of light or a beep). Then, a "target" appeared (another flash or beep).
  • The Trick: The distractor was irrelevant to the game, but the researchers wanted to see if it still "stuck" to the target.
  • The Result:
    • When participants were only paying attention to one sense (say, just looking at lights), the "stickiness" worked as expected: if they had just seen many lights, they judged the next light sequence as containing more. And even when the distractor was a sound, it still pulled their visual estimate toward it.
    • The Surprise: When participants had to pay attention to both senses at the same time (looking at lights and listening to sounds), the "stickiness" changed. The senses stopped sticking to themselves (light didn't stick to light) and started sticking to the other sense (sound made the lights stickier, and lights made the sounds stickier).

The Analogy: Imagine you are juggling red balls (vision) and blue balls (audition).

  • If you only care about the red balls, the red balls stick to each other, and the blue balls accidentally stick to the red ones too.
  • But if you are juggling both at once, the red balls stop sticking to other red balls and start sticking to the blue ones instead! The brain is mixing them together because it's trying to handle the whole picture.
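The juggling pattern above can be sketched as a pair of bias weights that attention flips (a sketch with made-up weights, not fitted to the paper's data): attending one sense, the same-sense prior pulls strongly and the other sense pulls weakly; attending both, the same-sense pull vanishes and the cross-sense pull dominates.

```python
def biased_estimate(true_value, prev_same, prev_other, attend_both):
    """Toy model of Experiment 1: attention flips which previous
    stimulus (same sense vs. other sense) pulls the estimate.
    The weight values are illustrative assumptions only."""
    if attend_both:
        w_same, w_other = 0.0, 0.15   # cross-sense pull takes over
    else:
        w_same, w_other = 0.15, 0.05  # same-sense pull dominates
    return (true_value
            + w_same * (prev_same - true_value)
            + w_other * (prev_other - true_value))

# Current target: 10 flashes; previous flash sequence had 20, previous beeps had 10.
one_sense = biased_estimate(10, prev_same=20, prev_other=10, attend_both=False)
both_senses = biased_estimate(10, prev_same=20, prev_other=10, attend_both=True)
print(one_sense, both_senses)
```

With attention on one sense, the large previous flash sequence inflates the estimate; with attention split across both, that same-sense history stops mattering and only the other sense's history pulls.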

Experiment 2: The "Traffic Cop" (Attention)

The researchers wanted to see if they could control this stickiness with a "traffic cop" (attention).

  • The Setup: They showed a mix of lights and sounds together, but told the participant: "Only look at the lights!" or "Only listen to the sounds!"
  • The Result: The "traffic cop" worked. If you were told to focus only on the lights, only the lights influenced your next guess. The sounds were ignored.
  • The Twist: However, if the lights and sounds disagreed (say, the flashes and beeps differed in number), the brain got picky. It would only stick to the thing it was told to watch.

The Analogy: Think of your brain as a radio.

  • If you are listening to one station (Vision), the static from another station (Audition) might bleed through and change how you hear the music.
  • But if you turn the dial to focus only on that station, the static disappears. The "stickiness" only happens with the channel you are tuned into.

Experiment 3: The "Brain Camera" (EEG)

Finally, they hooked people up to EEG machines (which record the brain's electrical activity, moment by moment, from the scalp) to see when this stickiness happens.

  • The Question: Does this happen when the brain is seeing the object (perception), or only after the person thinks about the answer (decision)?
  • The Result: The brain waves changed while the person was still looking at the stimulus. The "stickiness" happened almost immediately, before the person could even make a decision.
  • The Conclusion: This is strong evidence against a purely "High-Level" (decision-making) account. The stickiness isn't a mistake in the decision committee; it arises during sensory processing itself, at a stage where the senses already share information (the "mid-level" multisensory network mentioned at the top).

The Analogy: It's like putting on a pair of sunglasses that tint the world slightly based on what you saw five seconds ago. You don't decide to tint the world; the tint happens automatically as the light hits your eyes.

The Takeaway

This paper teaches us that our brain is a multisensory mixer, not a set of isolated rooms.

  1. It's Connected: Your eyes and ears talk to each other. What you see can change what you hear, and vice versa.
  2. It's Flexible: Your brain decides how much to mix them based on what you are paying attention to. If you are busy with one thing, it keeps the senses separate. If you are busy with everything, it blends them together to create a stable, continuous experience.
  3. It's Early: This happens right at the beginning of processing, helping your brain build a smooth, stable movie of reality, even when the raw data is noisy and chaotic.

In short, your brain isn't just a camera recording the world; it's a director that edits the footage in real time, using the last scene to smooth out the next one, and it uses all its senses to do the editing.
