Distributed representational encoding of food attributes in ventral visual cortex

Using fMRI and representational similarity analysis, this study demonstrates a functional dissociation within the ventral visual cortex: the lateral occipitotemporal cortex represents both visual and subjective food properties, while the fusiform gyrus selectively encodes perceived caloric content beyond what visual similarity alone explains.

Marrazzo, G., Pimpini, L., Kochs, S., De Martino, F., Valente, G., Roefs, A.

Published 2026-04-02

This is an AI-generated explanation of a preprint that has not been peer-reviewed. It is not medical advice. Do not make health decisions based on this content.

Imagine your brain is a massive, high-tech kitchen. When you look at a picture of a juicy burger or a crisp apple, your brain doesn't just see "food." It instantly runs a complex analysis: What does this look like? Does it look tasty? Is it healthy? How many calories are in it?

For a long time, scientists thought the part of the brain responsible for seeing objects (the Ventral Visual Cortex) was like a simple camera lens. They believed it only processed the "pixels"—the colors, shapes, and textures. They thought the "judgment" part (deciding if it's healthy or high-calorie) happened later, in a different part of the brain like the "manager's office" (the frontal lobes).

This study, however, suggests that the "camera lens" is actually much smarter than we thought. It's not just taking a photo; it's already starting to write the review.

Here is the breakdown of what the researchers found, using some simple analogies:

The Setup: The "Food Tasting" Experiment

The researchers put 25 women in an MRI machine (a giant camera that takes pictures of brain activity). They showed them 96 different pictures of food, from donuts to salads. While the participants looked at the pictures, they performed a simple task (pressing a button when a colored cross appeared) so that they stayed focused on the food images without deliberately evaluating the food.

After the scan, the participants rated every single food picture on a scale of 1 to 100 for:

  • How tasty it looked.
  • How healthy it looked.
  • How many calories they thought it had.

The Detective Work: RSA (The "Similarity Map")

Instead of just looking at which brain area "lit up" the brightest (like a simple on/off switch), the researchers used a technique called Representational Similarity Analysis (RSA).

Think of RSA like a social network for brain cells.

  • If your brain cells react to a donut and a cake in almost the same way, the brain puts them in the same "friend group."
  • If your brain cells react to a donut and broccoli very differently, they are put in different groups.

The researchers asked: Do the brain's "friend groups" match up with how the humans rated the food?
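The comparison the researchers ran can be sketched in a few lines. This is a toy illustration of the general RSA recipe, not the paper's actual pipeline: the array sizes, random data, and the choice of correlation distance are all assumptions for the demo.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Toy stand-ins for the real data: 96 food images.
n_images = 96
neural_patterns = rng.normal(size=(n_images, 200))    # voxel activity pattern per image
calorie_ratings = rng.uniform(1, 100, size=n_images)  # 1-100 rating per image

# Neural "similarity map" (RDM): how differently the brain responds
# to each pair of images (1 - correlation between activity patterns).
neural_rdm = pdist(neural_patterns, metric="correlation")

# Model RDM: how far apart each pair of images is in rated calories.
model_rdm = pdist(calorie_ratings[:, None], metric="euclidean")

# RSA question: do the brain's "friend groups" line up with the ratings?
rho, p = spearmanr(neural_rdm, model_rdm)
print(f"RSA correlation: rho={rho:.3f}")
```

A high positive `rho` would mean that foods the brain treats as similar are also foods the participants rated similarly, which is exactly the match the study tested region by region.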

The Big Discovery: Two Different "Chefs" in the Kitchen

The study found that the visual part of the brain isn't one single room; it has two distinct "chefs" (regions) that handle food information differently.

1. The "Generalist" Chef: LOTC (Lateral Occipitotemporal Cortex)

  • Location: A bit further back in the visual processing line.
  • What it does: This region is like a well-traveled food critic. It looks at the food and says, "This looks like a high-calorie treat, but it also looks unhealthy."
  • The Finding: The brain activity here matched the participants' ratings for both "How many calories?" and "How healthy?"
  • The Analogy: This region is like a general manager who looks at the menu and considers the whole picture: taste, health, and energy. It blends the visual look with the nutritional idea.

2. The "Specialist" Chef: Fusiform Gyrus

  • Location: A bit further forward in the visual line (often associated with face recognition, but here it's the "food specialist").
  • What it does: This region is like a nutritionist who only cares about the calorie count.
  • The Finding: This area was very specific. It organized the food images based only on how many calories the participants thought they had. It didn't care much about whether the food looked "healthy" or "tasty."
  • The Analogy: Imagine a robot that scans a burger and instantly calculates its energy output, ignoring the fact that it looks delicious or that it's bad for your heart. It isolates the "calorie" dimension perfectly, even after the researchers mathematically removed the visual similarities (like color or shape) from the data.
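The "mathematically removed the visual similarities" step is typically a partial correlation: regress the low-level visual similarity map out of both the neural map and the calorie map, then correlate what is left. A minimal sketch of that idea, with made-up data (the variable names, sizes, and the linear-residual approach are assumptions, not the paper's exact method):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_pairs = 4560  # number of image pairs for 96 images (96 * 95 / 2)

# Toy condensed RDMs: one dissimilarity value per pair of images.
visual_rdm = rng.uniform(size=n_pairs)                        # low-level look (color, shape)
neural_rdm = 0.5 * visual_rdm + rng.normal(0, 0.1, n_pairs)   # brain-pattern differences
calorie_rdm = 0.5 * visual_rdm + rng.normal(0, 0.1, n_pairs)  # rated-calorie differences

def residualize(y, x):
    """Remove the best linear fit of x (plus an intercept) from y."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

# Partial correlation: does calorie information still explain the
# neural map after the shared visual similarity is stripped out?
rho, p = spearmanr(residualize(neural_rdm, visual_rdm),
                   residualize(calorie_rdm, visual_rdm))
```

In this toy example both maps are driven only by visual similarity, so the partial `rho` hovers near zero; in the fusiform gyrus, the paper reports that the calorie relationship survived this kind of correction.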

The Surprise: The "Manager's Office" was Quiet

The researchers also looked at the Orbitofrontal Cortex (OFC) and Insula. In previous studies, these are the "Manager's Office" and the "Body Sensor" where we usually expect to find feelings of hunger, reward, and value.

  • The Expectation: They thought these areas would show complex patterns matching the "tasty" or "healthy" ratings.
  • The Reality: These areas lit up (showed activity) when looking at high-calorie vs. low-calorie food, but they didn't show the complex patterns the researchers were looking for.
  • Why? The researchers suggest this is because these areas are like volatile weather systems. They change based on how hungry you are right now, your mood, or what you just ate. Because the study controlled for hunger (everyone ate a snack 2 hours before), the "weather" was too calm to see the complex patterns. The brain was in "neutral" mode, so the deep value judgments weren't fully active.

The Takeaway: Your Eyes Are Already Judging

The most exciting part of this paper is the conclusion: Your visual system isn't just a camera.

When you look at a picture of a greasy pizza, your brain's visual cortex isn't just seeing "red sauce and white cheese." It is already organizing that image in a way that reflects your knowledge that "this is high-calorie."

  • The "Generalist" area (LOTC) sees the whole story: "This is tasty but unhealthy."
  • The "Specialist" area (Fusiform) zooms in on the energy: "This is pure fuel."

It turns out that the part of your brain that sees the food is deeply connected to the part of your brain that values the food. They aren't separate steps; they are happening together, right at the moment you look at the food.

In short: You don't just "see" food and then "decide" if it's good for you. Your brain's visual system has already started the nutritional analysis before you even realize you're hungry.
