Preregistered movie-fMRI analyses reveal altered visual feature encoding in autism in pSTS

Using preregistered movie-fMRI encoding models, this study finds that autistic children and adolescents show reduced high-level visual feature representation and a relative shift toward low-level processing in social brain regions such as the pSTS, supporting weak central coherence theories over early sensory enhancement hypotheses.

Original authors: Mentch, J., Chen, Y., Vanderwal, T., Ghosh, S. S.

Published 2026-03-24

This is an AI-generated explanation of a preprint that has not been peer-reviewed. It is not medical advice. Do not make health decisions based on this content.

The Big Picture: Watching a Movie to Understand the Autistic Brain

Imagine you are trying to understand how two different types of radio receivers work. One is a standard receiver (neurotypical brains), and the other is a specialized receiver (autistic brains). Both are tuned to listen to the same complex radio show (a movie).

For years, scientists had a theory that the "specialized" receiver was just super-powered. They thought autistic people heard every tiny crackle and saw every single pixel with super-sharp clarity (a theory called "Enhanced Perceptual Functioning").

This study says: "Actually, that's not quite right."

Instead of being a super-powered receiver, the autistic brain seems to be a receiver that is tuned differently. It's not that the signal is louder; it's that the brain is focusing on the details of the signal (like the static or the background noise) and missing the big story (the plot or the faces).


The Experiment: The "Stacked" Decoder

The researchers didn't just ask kids to sit still and look at a dot. They let them watch natural movies (like Despicable Me and The Present) while inside an MRI machine.

To understand what the brain was doing, they used a tool called an Encoding Model. Think of this like a recipe decoder:

  1. The Ingredients: They broke the movie down into "ingredients."
    • Low-level ingredients: Brightness, motion, loudness, simple sounds.
    • High-level ingredients: Faces, bodies, speech, music, social cues.
  2. The Decoder: They asked the computer: "Which ingredients does the brain care about most?"
  3. The Stacking: They used a "stacked" model. Imagine you have two chefs. One only cooks with basic spices (low-level), and the other only cooks with complex sauces (high-level). The "stacked" model asks: "If we combine both chefs, how much of the final dish does each chef actually contribute?"
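The "two chefs" idea above can be sketched as a toy stacked encoding model. Everything here is an illustrative assumption, not the authors' actual pipeline: the feature counts, the ridge penalty, and the simulated voxel response are all made up, and a real analysis would use proper cross-validation and many voxels.

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test = 300, 100
n = n_train + n_test

# Hypothetical feature spaces extracted from the movie, one row per timepoint.
X_low = rng.normal(size=(n, 5))    # e.g. brightness, motion, loudness
X_high = rng.normal(size=(n, 8))   # e.g. faces, speech, social cues

# Simulated voxel response: mostly driven by high-level features, plus noise.
y = X_high @ rng.normal(size=8) + 0.3 * (X_low @ rng.normal(size=5))
y += rng.normal(scale=0.5, size=n)

def ridge_predict(X, y, split, alpha=1.0):
    """Closed-form ridge regression: fit on the first `split` timepoints,
    predict the held-out remainder."""
    Xtr = X[:split]
    w = np.linalg.solve(Xtr.T @ Xtr + alpha * np.eye(X.shape[1]), Xtr.T @ y[:split])
    return X[split:] @ w

# Step 1: one encoding model ("chef") per feature space.
pred_low = ridge_predict(X_low, y, n_train)
pred_high = ridge_predict(X_high, y, n_train)

# Step 2: stacking — find the combination of the two chefs' held-out
# predictions that best matches the voxel, then read off each chef's
# relative contribution.
P = np.column_stack([pred_low, pred_high])
w_stack, *_ = np.linalg.lstsq(P, y[n_train:], rcond=None)
w_stack = np.clip(w_stack, 0.0, None)   # contributions can't be negative
w_stack /= w_stack.sum()                # normalize to relative shares
print("relative contribution (low, high):", w_stack.round(2))
```

Because the simulated voxel was built mainly from high-level features, the stacked weights should favor the "high-level chef"; in the real study, the interesting question is how those weights shift between groups and brain regions.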

The Findings: What They Discovered

1. No "Super-Vision" in the Basics

The Myth: Autistic people see and hear basic things (like a flashing light or a loud beep) much better than everyone else.
The Reality: When the researchers looked at the "primary sensory" parts of the brain (the first stop for sight and sound), there was no difference. The autistic brain wasn't "super-charged" at the entry level. The volume knob wasn't turned up higher.

2. The "Social Hub" Got Stuck on Details

The Discovery: The big difference showed up in a specific area called the pSTS (posterior Superior Temporal Sulcus). You can think of this as the brain's "Social Integration Station." It's where the brain usually takes all the little details and combines them to understand a face, a gesture, or a conversation.

  • In Neurotypical Brains: This station acts like a Director. It says, "Ignore the background noise; focus on the actor's face and what they are saying." It prioritizes the high-level story.
  • In Autistic Brains: This station acts more like a Sound Engineer focused on the raw audio. It says, "Let's focus on the texture of the voice and the movement of the lips, but maybe we're missing the emotional context."

The autistic brain in this study showed a shift in priority: it was spending more energy on the "low-level" details (motion, brightness) and less on the "high-level" meaning (faces, social cues).

3. The "Volume" of Symptoms

The researchers found a direct link between this "tuning" and how severe a person's social symptoms were.

  • Analogy: Imagine a radio dial. The further you turn the dial toward "Static/Details" (low-level), the harder it is to hear the "Music/Story" (high-level).
  • The Result: The more a person's brain was tuned to the "details" in the pSTS, the higher their scores were on the Social Responsiveness Scale (SRS). It's like a volume knob: the more you focus on the pixels, the quieter the social story becomes.
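The brain-behavior link above amounts to a simple correlation across participants. The sketch below uses entirely simulated numbers (a hypothetical per-person "low-level share" from the stacked model, and made-up SRS scores) just to show the shape of the analysis; it is not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)
n_subjects = 30

# Hypothetical per-subject values: the share of pSTS variance the stacked
# model attributes to low-level features (0..1), and an SRS total score
# simulated to increase with that share.
low_level_share = rng.uniform(0.2, 0.8, size=n_subjects)
srs = 40 + 60 * low_level_share + rng.normal(scale=8, size=n_subjects)

# Brain-behavior correlation: more low-level weighting in pSTS going
# with higher (more affected) social responsiveness scores.
r = np.corrcoef(low_level_share, srs)[0, 1]
print(f"r = {r:.2f}")
```

A positive r here mirrors the paper's reported pattern; a real analysis would also control for age and motion, and ideally use a preregistered test.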

4. No "Audio vs. Visual" War

There was a theory that autistic people might flip the usual balance between the senses, relying on hearing where most people rely on sight (a "reverse Colavita effect" — the Colavita effect is the typical dominance of vision over audition).
The Reality: The study found that the balance between hearing and seeing was mostly the same for both groups. The autistic brain wasn't suddenly becoming "deaf" to visuals or "blind" to audio. The difference was what they were paying attention to within those senses (details vs. meaning), not which sense they preferred.

5. Age Matters More Than Diagnosis

One of the most interesting findings was about growing up.

  • Analogy: Think of the brain like a city. When you are a child, the roads are a bit messy, and traffic goes everywhere. As you get older, the city builds better highways.
  • The Result: As children got older, their brains naturally got better at separating "visual roads" from "audio roads." This developmental change was actually stronger than the difference between autistic and non-autistic groups. The brain is constantly rewiring itself as it matures, and this study captured that journey.

The "ADHD" Twist

The researchers also noticed something interesting about ADHD.

  • The "detail-focused" shift in the autistic brain was mostly driven by autistic kids without ADHD.
  • Autistic kids with ADHD looked more like the non-autistic group in how they processed these movies.
  • Takeaway: This suggests that autism and ADHD might affect the brain in different ways, and mixing them together in studies can sometimes hide the true picture.

The Conclusion: A New Way to Listen

This study changes the story. It suggests that autism isn't about having "superpowers" in basic senses. Instead, it's about how the brain weighs information.

In the social parts of the brain, the autistic brain seems to be over-weighting the details (the pixels, the motion) and under-weighting the big picture (the face, the emotion). It's not that the signal is broken; it's that the brain is prioritizing the "ingredients" over the "recipe."

By using movies and these advanced "decoders," scientists can now see exactly where and how this weighting happens, opening the door to better understanding and support for neurodevelopmental differences.
