Homotopy Cardinality and Entropy

This paper establishes a bridge between homotopy type theory and information theory by defining probability types, demonstrating that homotopy cardinality respects dependent sums but not products, and deriving Shannon entropy and its chain rule as the homotopy cardinality of specific types constructed via deloopings of finite cyclic groups.

Andrés Ortiz-Muñoz

Published Mon, 09 Ma

Imagine you are trying to measure the "size" of a shape, but not just its physical area or volume. You want to measure its complexity and uncertainty.

This paper, written by Andrés Ortiz-Muñoz, is a bridge between two very different worlds:

  1. Homotopy Type Theory: A branch of math that treats logical statements as shapes and spaces.
  2. Information Theory: The science of data, probability, and how much "surprise" is in a message (Entropy).

The author's big discovery? Shannon Entropy (the standard measure of uncertainty in information theory) is actually just a specific way of counting the "size" of a mathematical shape.

Here is the breakdown using simple analogies:

1. The "Weighted" Size of a Shape

In normal math, if you have a set of 3 apples, the size is 3.
In this paper's world (Homotopy Cardinality), things are more like a bouncy castle.

  • If you have a simple apple, it counts as 1.
  • If you have a shape that can wiggle or rotate in specific ways (like a spinning top), it counts as a fraction.
  • The Rule: The more ways a shape can wiggle (its symmetries), the smaller its "cardinality" (size) becomes.
    • Analogy: Imagine a fair coin. It has two sides (Heads, Tails). But it also has a "flip" symmetry. In this math, the "size" of the coin isn't 2: you divide the 2 sides by the 2 symmetries (identity and flip), and 2 ÷ 2 = 1. This is called a Probability Type. It's a shape that perfectly represents a 100% chance of something happening.
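The counting rule above can be sketched in a few lines of Python. This is only an illustrative sketch, not the paper's formalism: `groupoid_cardinality` and `action_groupoid_cardinality` are hypothetical helper names, encoding the standard facts that each component of a shape contributes 1 divided by the size of its symmetry group, and that quotienting a set of points by a symmetry group divides the count.

```python
from fractions import Fraction

def groupoid_cardinality(symmetry_sizes):
    """Homotopy cardinality of a shape, given the size of the symmetry
    group of each of its connected pieces: each piece contributes
    1 / |symmetries|, so more wiggling means a smaller total."""
    return sum(Fraction(1, n) for n in symmetry_sizes)

def action_groupoid_cardinality(num_points, group_order):
    """Quotienting |X| points by a symmetry group G gives a shape of
    size |X| / |G|."""
    return Fraction(num_points, group_order)

# A plain set of 3 apples: three pieces, no symmetries -> size 3.
print(groupoid_cardinality([1, 1, 1]))   # 3

# The fair coin: 2 sides folded together by the order-2 flip -> size 1.
print(action_groupoid_cardinality(2, 2))  # 1
```

The `Fraction` type keeps the fractional sizes exact instead of approximating them with floats.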

2. The "Surprise" Machine (Entropy)

Entropy is a measure of how much you don't know.

  • If you know a coin is rigged to always land on Heads, there is zero surprise (Zero Entropy).
  • If you have a fair coin, there is maximum surprise (High Entropy).
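The two coins can be checked numerically with the standard Shannon entropy formula, H = −Σ pᵢ log₂ pᵢ (this is textbook information theory, not something specific to the paper):

```python
from math import log2

def shannon_entropy(probs):
    """H(p) = -sum(p_i * log2(p_i)) in bits, with 0 * log 0 taken as 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([1.0, 0.0]))  # rigged coin: 0.0 bits, zero surprise
print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit, maximum surprise
```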

The paper asks: Can we build a shape where the "size" of the shape equals the "surprise" of the coin?

The Answer: Yes.
The author builds a weird, abstract shape using "loops" and "cycles" (think of a necklace made of beads).

  • He uses a mathematical trick involving logarithms (the math behind compound interest and information).
  • He constructs a shape that represents "everything that is not the current outcome."
  • When he calculates the "size" (Homotopy Cardinality) of this specific shape, the number that pops out is exactly the Shannon Entropy.
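The numeric identity behind this can be sketched as follows. The paper's actual construction uses deloopings of finite cyclic groups; the sketch below only records the two facts driving it: the delooping B(ℤ/n) is a single connected shape with n symmetries, so its cardinality is 1/n, and the "surprise" of a uniform n-way outcome is −log₂ of that. The function name `delooping_cardinality` is ours, not the paper's.

```python
from fractions import Fraction
from math import log2

def delooping_cardinality(n):
    """B(Z/n), the delooping of the cyclic group of order n, is one
    connected shape whose only wiggles are the n rotations, so its
    homotopy cardinality is 1/n."""
    return Fraction(1, n)

# The surprise of one outcome among n equally likely ones is log2(n),
# which is exactly -log2 of the delooping's size:
for n in (2, 4, 8):
    assert -log2(delooping_cardinality(n)) == log2(n)

# Shannon entropy is then the probability-weighted average surprise:
probs = [0.5, 0.25, 0.25]
print(sum(p * log2(1 / p) for p in probs))  # 1.5 bits
```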

The Metaphor:
Imagine you are trying to guess a secret code.

  • Entropy is how many questions you need to ask to figure it out.
  • The author built a giant, abstract "Question Machine" (a type).
  • The "weight" of this machine is exactly equal to the number of questions you need to ask.

3. The Chain Rule (The "Lego" Problem)

In information theory, there is a famous rule called the Chain Rule. It says:

The total surprise of a story is the surprise of the first sentence, plus the average surprise of the rest of the story, given the first sentence.

The paper proves that this rule works in their shape-world, BUT with a catch.

  • The Catch: The way the "rest of the story" depends on the "first sentence" must not be "twisted" as you move through the shape.
  • The Analogy: Imagine building a tower out of Lego blocks.
    • If the blocks just stack on top of each other (independent), the total height is just the sum of the parts.
    • But if the blocks are twisted or glued together in a weird way (dependent), the total height changes.
    • In the math paper, this "twisting" is called Transport Action. If the shape twists as you move through it, the simple addition rule (Chain Rule) breaks. The paper shows exactly when the rule works (when there is no twisting) and when it fails.
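In plain information theory (where there is no twisting to worry about), the chain rule is the exact identity H(X, Y) = H(X) + H(Y | X), and it can be verified numerically. The sketch below checks it for a small made-up joint distribution; it illustrates the numeric rule the paper lifts to the level of shapes, not the paper's transport-action machinery.

```python
from math import log2

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A made-up joint distribution over (first sentence X, rest of story Y).
joint = {("a", "x"): 0.25, ("a", "y"): 0.25,
         ("b", "x"): 0.40, ("b", "y"): 0.10}

# Marginal distribution of X.
px = {}
for (x, _), p in joint.items():
    px[x] = px.get(x, 0.0) + p

# Conditional entropy H(Y | X): average surprise of Y given each X.
H_Y_given_X = sum(
    px[x] * H([joint[(x, y)] / px[x] for y in ("x", "y")])
    for x in px
)

# Chain rule: total surprise = first part + conditional rest.
lhs = H(list(joint.values()))
rhs = H(list(px.values())) + H_Y_given_X
assert abs(lhs - rhs) < 1e-12
```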

4. What This Means for Us

This isn't just abstract math for math's sake. It suggests that uncertainty is a geometric property.

  • Old View: Entropy is a formula you calculate with numbers.
  • New View: Entropy is the physical "volume" of a shape made of logic and probability.

The author also points out where this math gets tricky. Homotopy cardinality plays nicely with one way of combining shapes (dependent sums) but not another (dependent products, the shape-world analogue of multiplying over a whole family); when you combine shapes that second way, the "size" doesn't behave the way you expect. It's like trying to measure the volume of a shadow; sometimes the shadow disappears even though the object is there.

Summary

  • The Goal: To find the "size" of uncertainty.
  • The Method: Build a mathematical shape that represents a probability distribution.
  • The Result: The "size" of this shape is exactly the Shannon Entropy.
  • The Lesson: Information theory and the geometry of shapes are secretly the same language. When you understand the shape of a probability, you understand the information it carries.

It's like discovering that the "weight" of a mystery novel is exactly equal to the number of pages you have to turn to solve the mystery. The paper gives us the ruler to measure that weight.