Structural and contextual biases interact to shape duration perception

This study demonstrates that human duration perception is shaped by the interaction of structural biases, such as the tendency for sounds to be judged longer than lights, and contextual biases driven by the statistics of the environment. The authors conclude that the brain combines Bayesian inference with a rescaling mechanism that normalizes temporal representations across contexts.

Original authors: Grabot, L., Giersch, A., Mamassian, P.

Published 2026-03-20

This is an AI-generated explanation of a preprint that has not been peer-reviewed. It is not medical advice. Do not make health decisions based on this content.

The Big Idea: How Your Brain Tells Time

Imagine your brain is a master chef trying to guess how long a soup has been simmering. The chef doesn't have a stopwatch; they have to rely on two things:

  1. The Recipe (Structural Priors): Built-in rules about the senses themselves, like "sounds usually feel longer than lights of the same duration."
  2. The Kitchen Context (Contextual Priors): The specific situation right now. Is the kitchen full of short, quick stirs? Or long, slow simmers?

This paper asks: How does the chef mix these two ingredients to decide if the soup is ready?

The researchers found that the brain does something surprisingly clever. It doesn't just follow the recipe or just look at the context. It does both, but it also performs a mental "stretch and shrink" trick to make sure the two ingredients can be compared fairly.


The Experiment: The "Which Lasted Longer?" Game

The researchers set up a game for 30 people.

  • The Game: Two sounds or two flashes of light (or one of each) would play one after the other.
  • The Task: The player had to press a button to say, "Which one lasted longer?"
  • The Twist: The researchers changed the "menu" (the context). Sometimes, the game only used short durations (like quick beeps). Other times, it used long durations (like slow chimes).

What they expected:
Usually, if you are used to hearing short beeps, a medium beep feels long. If you are used to long chimes, that same medium beep feels short. This is called the Central Tendency (or "Regression to the Mean"). It's like your brain saying, "This is average for this group, so I'll guess it's average."
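This "regression to the mean" expectation can be sketched as a simple Bayesian average: the estimate is a precision-weighted blend of the noisy measurement and the context mean. All numbers below are invented for illustration and are not the study's parameters:

```python
# Toy sketch of Bayesian central tendency (Gaussian prior x Gaussian likelihood).
# Estimates get pulled toward the mean of the current context.

def bayes_estimate(measurement_ms, prior_mean_ms, sigma_sensory, sigma_prior):
    """Posterior mean: precision-weighted average of measurement and prior."""
    w_sensory = 1 / sigma_sensory**2   # reliability of the sensory reading
    w_prior = 1 / sigma_prior**2       # reliability of the context expectation
    return (w_sensory * measurement_ms + w_prior * prior_mean_ms) / (w_sensory + w_prior)

# The same 500 ms beep is pulled down in a "short" context (mean 300 ms)
# and pulled up in a "long" context (mean 700 ms):
short_ctx = bayes_estimate(500, 300, sigma_sensory=100, sigma_prior=150)
long_ctx = bayes_estimate(500, 700, sigma_sensory=100, sigma_prior=150)
print(round(short_ctx), round(long_ctx))  # → 438 562
```

Note how the identical physical duration yields two different estimates, purely because the surrounding "menu" of durations differs.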

What they found:
The brain did the opposite of what simple averaging predicts. When it encountered a medium beep in a group of long chimes, it didn't just think "it's average." It actually thought, "Wait, this is shorter than the others!"

The Solution: The "Rubber Band" Analogy

To explain this weird result, the researchers proposed a model with two main steps:

1. The "Two Separate Notebooks" (Distinct Representations)

The brain doesn't mix all the short and long sounds into one big pile. Instead, it keeps two separate notebooks.

  • Notebook A: Contains the rules for the "Short" group.
  • Notebook B: Contains the rules for the "Long" group.

When a sound plays, the brain knows exactly which notebook to open. It doesn't get confused by the other group.

2. The "Rubber Band Stretch" (Rescaling)

This is the most important discovery. Before the brain compares the two sounds, it stretches or shrinks the mental ruler.

Imagine this:
You are comparing the height of two people.

  • Person A is standing in a room where everyone is 6 feet tall.
  • Person B is standing in a room where everyone is 5 feet tall.

If you just look at them, Person A looks tall and Person B looks short. But to compare them fairly, your brain grabs a rubber band (the rescaling mechanism). It stretches the "5-foot room" and shrinks the "6-foot room" so that both people are standing on the same flat floor.

In the experiment, the brain realized: "Oh, this sound came from the 'Long' group, so I need to shrink my perception of it to compare it fairly to the 'Short' group." This rescaling fights against the usual "guessing the average" instinct.
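One way to picture this rescaling step is to express each duration relative to its own context's statistics, so items from different "rooms" land on a common scale. This is a minimal sketch with made-up context numbers, not the authors' fitted model:

```python
# Illustrative rescaling: normalize a duration by its own context's
# mean and spread before comparing across contexts.

def rescale(value_ms, ctx_mean_ms, ctx_range_ms):
    """Express a duration in context-relative (z-score-like) units."""
    return (value_ms - ctx_mean_ms) / ctx_range_ms

# A 500 ms item from the "long" context (mean 700, range 400) and a
# 400 ms item from the "short" context (mean 300, range 200):
long_item = rescale(500, 700, 400)    # negative: short *for its group*
short_item = rescale(400, 300, 200)   # positive: long *for its group*
print(long_item, short_item)  # → -0.5 0.5
```

On this common scale, the physically longer item (500 ms) now counts as "shorter" than the 400 ms item, which is exactly the kind of reversal the rubber-band analogy describes.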

The "Sound vs. Sight" Bias (Structural Priors)

The study also looked at a built-in bias in our brains: We always think sounds last longer than lights of the same length.

  • If a light flashes for 1 second and a beep lasts for 1 second, you will almost always say the beep felt longer.
  • The researchers found this is a hard-wired rule (a "structural prior") that everyone has, though some people have it more strongly than others. It's like having a permanent filter on your glasses that makes sound "stretch" more than sight.
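This built-in bias can be caricatured as a modality-specific gain on the internal clock. The 15% figure below is an arbitrary illustration chosen for the sketch, not a value reported in the paper:

```python
# Hedged sketch of a structural (modality) bias: the same physical duration
# is internally "stretched" more for audition than for vision.

AUDITORY_GAIN = 1.15   # assumed: sounds feel ~15% longer (illustrative only)
VISUAL_GAIN = 1.00

def perceived_ms(duration_ms, modality):
    """Apply a fixed modality gain to a physical duration."""
    gain = AUDITORY_GAIN if modality == "sound" else VISUAL_GAIN
    return duration_ms * gain

# A 1-second beep and a 1-second flash get different internal durations:
print(perceived_ms(1000, "sound"), perceived_ms(1000, "light"))
```

Because the gain is a fixed property of the observer rather than of the current context, it behaves like the "permanent filter on your glasses" described above, though its strength varies from person to person.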

Why Does This Matter?

This paper changes how we think about time perception.

  • Old Idea: Our brain is a passive recorder that gets confused by context.
  • New Idea: Our brain is an active editor. It constantly re-calibrates its internal clock to fit the current environment. It creates a "common language" for time so it can make fair comparisons, even when the environment changes.

Summary in a Nutshell

Your brain is like a smart translator.

  1. It knows that sounds naturally feel longer than lights (Structural Bias).
  2. It keeps separate dictionaries for different groups of time (Short vs. Long).
  3. Before making a decision, it stretches or shrinks the mental ruler (Rescaling) so that a "long" sound and a "short" sound can be compared on the same scale.
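
The three steps above can be strung together in one toy pipeline. Every parameter here is invented for illustration; the paper's actual model is quantitative and fitted to data:

```python
# Toy end-to-end pipeline: structural bias, then contextual rescaling,
# then comparison on the common scale. All numbers are illustrative.

def internal_duration(ms, modality, ctx_mean, ctx_range):
    gain = 1.15 if modality == "sound" else 1.0   # structural bias (assumed)
    scaled = ms * gain                            # biased internal measurement
    return (scaled - ctx_mean) / ctx_range        # contextual rescaling

# Compare a sound from the "long" context with a light from the "short" one:
a = internal_duration(600, "sound", ctx_mean=700, ctx_range=400)
b = internal_duration(600, "light", ctx_mean=300, ctx_range=200)
print("sound judged longer" if a > b else "light judged longer")
```

In this made-up case the context effect outweighs the sound-stretching bias: the light wins because, on the common rescaled floor, it sits far above its group's average while the sound sits below its own.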

This explains why our perception of time is so flexible and why we can adapt so quickly to different environments, from a fast-paced video game to a slow, quiet library.
