Information Inequalities for Five Random Variables

This paper utilizes a computationally optimized variant of the Maximum Entropy Method to derive and prove two infinite families of non-Shannon entropy inequalities for five random variables, thereby advancing the understanding of the structure of the five-variable entropic region.

E. P. Csirmaz, L. Csirmaz

Published 2026-03-04

Imagine you are trying to map the shape of a mysterious, invisible island. This island isn't made of land and water, but of information. In the world of math and computer science, this island is called the Entropy Region.

Every point on this island represents a possible way that information can be shared, stored, or transmitted between different variables (like random events or data sources). For a long time, mathematicians knew the "coastline" of this island for small groups of variables (up to four). They knew the basic rules, called Shannon inequalities, which act like the ocean's horizon: you can't have negative information, and certain combinations of data must follow specific limits.
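The Shannon "ocean rules" can be checked numerically on any concrete distribution. Here is a minimal Python sketch, using an assumed toy distribution over three binary variables (not an example from the paper), that computes subset entropies and verifies two basic Shannon inequalities:

```python
import math

# Toy joint distribution over (X, Y, Z) where Z = X XOR Y
# (an assumed illustration, not a distribution from the paper).
dist = {
    (0, 0, 0): 0.25, (0, 1, 1): 0.25,
    (1, 0, 1): 0.25, (1, 1, 0): 0.25,
}

def H(vars_idx):
    """Shannon entropy (in bits) of the marginal on the given coordinates."""
    marg = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in vars_idx)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marg.values() if p > 0)

# Monotonicity: H(X, Y) >= H(X) -- you can't have negative information.
assert H((0, 1)) >= H((0,)) - 1e-12
# Submodularity: H(X, Y) + H(Y, Z) >= H(X, Y, Z) + H(Y).
assert H((0, 1)) + H((1, 2)) >= H((0, 1, 2)) + H((1,)) - 1e-12
print("Shannon inequalities hold for this distribution")
```

Every valid distribution satisfies these; the paper's non-Shannon inequalities are extra constraints that Shannon's rules alone cannot detect.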

However, for five variables, the map was mostly blank. We knew the ocean (the Shannon rules), but we didn't know if there were hidden reefs, cliffs, or caves inside the island that the ocean rules didn't describe. These hidden features are called non-Shannon inequalities. Finding them is like discovering that the island has a secret underground tunnel system that changes how you can navigate it.

The Problem: A 31-Dimensional Maze

The authors of this paper, Csirmaz and Csirmaz, tackled the five-variable case. The problem is that five variables create a 31-dimensional space. Imagine trying to draw a map of a maze that exists in 31 dimensions instead of 2 or 3. It's impossible for a human to visualize, and even for computers, the number of paths to check is so huge that it would take longer than the age of the universe to solve using standard methods.
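Where does 31 come from? Each nonempty subset of the five variables gets one entropy coordinate, and a quick count confirms the dimension:

```python
from itertools import combinations

# One entropy coordinate H(S) for every nonempty subset S of 5 variables,
# so the ambient space has 2**5 - 1 = 31 dimensions.
n = 5
subsets = [S for r in range(1, n + 1) for S in combinations(range(n), r)]
print(len(subsets))  # 31
```

For four variables the same count gives 15 dimensions, which is why the four-variable region was already within reach.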

The Solution: The "Maximum Entropy" Recipe

To solve this, the authors used a clever trick called the Maximum Entropy Method (MEM).

Think of it like this:

  1. The Setup: You have a group of friends (the random variables) sharing secrets. You know how much they share with each other in specific pairs.
  2. The Copy Trick: The authors imagine creating "clones" of some of these friends. They create n copies of one friend and m copies of another.
  3. The Rule of Maximum Entropy: Nature loves chaos (entropy). If you have a system where some rules are fixed (the original secrets), but the rest is free to vary, the system will naturally settle into the state with the most possible randomness (maximum entropy).
  4. The Discovery: By forcing these clones to follow the same rules as the originals, the authors found that the clones must obey new, stricter rules to maintain that maximum randomness. These new rules are the non-Shannon inequalities.

It's like realizing that if you have a group of people sharing a secret, and you make 100 copies of them, the way they can share information becomes much more restricted than if you only had 2 people. The "cloning" process reveals hidden constraints.
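The cloning step can be sketched concretely. Below, assuming a toy two-variable distribution (not one from the paper), Y is "cloned" into Y1 and Y2 that are conditionally independent given X: each copy inherits Y's exact relationship with X, yet the copies share no information beyond what X provides. This is a hedged illustration of the general copy idea, not the paper's actual five-variable construction:

```python
import math
from collections import defaultdict

# Assumed toy joint distribution p(x, y).
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

# "Clone" Y: build p(x, y1, y2) = p(x) * p(y1|x) * p(y2|x),
# i.e. Y1 and Y2 are independent copies of Y conditioned on X.
px = defaultdict(float)
for (x, y), p in pxy.items():
    px[x] += p
pxyy = {}
for (x, y1), p1 in pxy.items():
    for (x2, y2), p2 in pxy.items():
        if x == x2:
            pxyy[(x, y1, y2)] = p1 * p2 / px[x]

def H(dist, idx):
    """Entropy (bits) of the marginal on the given coordinates."""
    marg = defaultdict(float)
    for outcome, p in dist.items():
        marg[tuple(outcome[i] for i in idx)] += p
    return -sum(p * math.log2(p) for p in marg.values() if p > 0)

# Each copy looks exactly like the original relative to X...
assert abs(H(pxyy, (0, 1)) - H(pxy, (0, 1))) < 1e-9  # H(X, Y1) = H(X, Y)
assert abs(H(pxyy, (1,)) - H(pxy, (1,))) < 1e-9      # H(Y1) = H(Y)
# ...but the copies share nothing beyond X:
# I(Y1; Y2 | X) = H(X,Y1) + H(X,Y2) - H(X,Y1,Y2) - H(X) = 0
i_cond = H(pxyy, (0, 1)) + H(pxyy, (0, 2)) - H(pxyy, (0, 1, 2)) - H(pxyy, (0,))
assert abs(i_cond) < 1e-9
print("copy step preserves marginals; I(Y1;Y2|X) = 0")
```

Applying Shannon's rules to the enlarged system with the clones, and then projecting back to the original variables, is what yields inequalities that are not Shannon-provable on the original variables alone.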

The Computational Challenge: Folding the Map

Even with this clever recipe, the math was still too heavy. The computer would get lost in the 31-dimensional maze.

The authors used two main strategies to shrink the maze:

  1. Symmetry (The Mirror Trick): They realized that swapping identical clones doesn't change the outcome. It's like realizing that in a room full of identical twins, it doesn't matter which twin is which; the group behaves the same way. This allowed them to ignore billions of redundant calculations.
  2. Tightening (The Squeeze): They focused only on the "tight" parts of the problem, discarding the easy, predictable parts (the modular components), which hide no new constraints.
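The symmetry savings can be illustrated on a tiny assumed example: with three interchangeable clones of one variable A plus one other variable B, only the *number* of clones in a subset matters, so the 16 raw subsets collapse to 8 symmetry classes (the paper's real instances are far larger, so the savings compound dramatically):

```python
from itertools import combinations

# Assumed toy setup: 3 interchangeable clones of A, plus one variable B.
clones = ["A1", "A2", "A3"]
others = ["B"]
all_vars = clones + others

def canonical(subset):
    # Swapping identical clones changes nothing, so a subset is
    # determined by how many clones it contains plus which other
    # variables appear.
    n_clones = sum(1 for v in subset if v in clones)
    rest = tuple(sorted(v for v in subset if v not in clones))
    return (n_clones, rest)

raw = [S for r in range(len(all_vars) + 1) for S in combinations(all_vars, r)]
classes = {canonical(S) for S in raw}
print(len(raw), "raw subsets collapse to", len(classes), "symmetry classes")
```

With n clones the raw subset count grows like 2^n while the class count grows only linearly in n, which is the kind of collapse that made the computation feasible.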

By combining these tricks, they managed to run their "cloning" experiment up to 9 generations (creating up to 9 copies of variables). This was the limit of what their computers could handle before the numbers got too messy (numerical instability).

The Big Discovery: Infinite Families of Rules

From these 9 generations of experiments, the authors didn't just find a few random rules. They found a pattern.

They realized these new rules follow a beautiful, infinite structure. They described them using downward-closed staircases on a grid.

  • The Analogy: Imagine a staircase going down from the top-left corner. You can only step down or to the right. Every unique shape of this staircase corresponds to a specific new rule about how information can be shared.
  • They proved that every one of these staircase shapes generates a valid, new rule that limits the entropy region.
  • They even developed an algorithm to list all the "essential" staircases (the ones that aren't just copies of others) up to generation 60, far beyond what their computers could directly calculate.
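A downward-closed staircase on an n x m grid can be encoded as a weakly decreasing sequence of column heights. The sketch below, with small assumed grid dimensions, enumerates all such shapes and checks the count against the standard binomial formula C(n+m, n) for partitions fitting in an n x m box (this encoding is a common one; the paper's precise correspondence between staircases and inequalities is more involved):

```python
from itertools import product
from math import comb

# Assumed small grid dimensions for illustration.
n, m = 3, 3

def staircases(n, m):
    """Yield downward-closed staircases in an n x m grid,
    encoded as weakly decreasing column heights."""
    for heights in product(range(m + 1), repeat=n):
        if all(heights[i] >= heights[i + 1] for i in range(n - 1)):
            yield heights

shapes = list(staircases(n, m))
print(len(shapes), "staircases; binomial check:", comb(n + m, n))
```

The exponential growth of this count with the grid size is exactly why an explicit list of "essential" staircases, rather than brute-force enumeration of inequalities, was needed.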

Why Does This Matter?

You might ask, "Who cares about invisible 31-dimensional islands?"

These rules have real-world applications:

  • Network Coding: Imagine sending a video stream to thousands of people. These new rules tell engineers the absolute theoretical limit of how fast data can flow. If a network tries to go faster than these new rules allow, it's mathematically impossible, no matter how good the hardware is.
  • Secret Sharing: If you want to split a password among 5 people so that only certain groups can unlock it, these rules tell you the minimum size the password fragments must be.
  • AI and Causality: When AI tries to figure out if A causes B, these rules help it eliminate impossible scenarios, making the AI smarter and more accurate.

The Conclusion

The paper is a tour de force of mathematical detective work. The authors took a problem that seemed too big to solve (mapping a 31D information island), built a special tool (the cloning method), used symmetry to shrink the problem, and discovered an infinite family of hidden laws.

They conjecture that they have found all the rules that this specific method can produce. While the map of the 5-variable entropy region is still not 100% complete, they have filled in a massive, previously unknown section of the coastline, revealing that the island is much more complex and structured than anyone previously imagined.