Sharing is caring: Attestable and Trusted Workflows out of Distrustful Components

This paper presents Mica, a confidential computing architecture built on Arm CCA that decouples confidentiality from trust. Mica lets tenants explicitly define, restrict, and attest the communication paths between distrustful TEE components, preventing sensitive data from leaking without significantly expanding the trusted computing base.

Amir Al Sadi, Sina Abdollahi, Adrien Ghosn, Hamed Haddadi, Marios Kogias

Published 2026-03-10

The Big Problem: The "Trust Me" Trap

Imagine you are building a high-security factory to process your most valuable secrets (like your bank details or private medical records). You hire three different companies to do the work:

  1. Company A sorts the data.
  2. Company B analyzes it.
  3. Company C stores the results.

In the current world of "Confidential Computing" (using special secure hardware called TEEs), each company gets its own super-secure, locked room. The hardware guarantees that no one outside can peek inside their room.

However, there is a catch: To do their job, these companies need to pass data to each other. They have to hand a package from Room A to Room B, and then to Room C.

Currently, the hardware says: "I will lock the rooms, but once you hand the package to the next room, I don't know what happens. You have to trust that Company B won't peek at the package, and that Company C won't steal it."

This is a fragile assumption. In the real cloud, these companies often don't trust each other: they might be rivals, or they might be owned by different people. Asking them to trust each other is like asking a spy to trust the person they are trying to catch. If one company is sloppy or malicious, your secrets leak.

The Solution: Mica (The "Strict Bouncer")

The researchers built a new system called Mica. Instead of relying on the companies to be honest, Mica acts like a super-strict, rule-following bouncer who controls every single door and window between the rooms.

Mica changes the rules of the game in three clever ways:

1. From "Implicit" to "Explicit" (The Guest List)

  • Old Way: "You can talk to whoever you want, as long as you don't get caught." (This relies on the software inside the room being perfect).
  • Mica Way: "You must submit a Guest List before you start."
    • You tell Mica: "I am only allowed to pass a package to Company B. I am NOT allowed to talk to the internet, and I am NOT allowed to talk to Company C."
    • Mica checks this list. If you try to pass a package to anyone not on the list, Mica slams the door shut immediately.
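
The "guest list" idea can be sketched in a few lines. This is an illustrative model only, not Mica's actual API: the `Policy` class and `can_send` function are hypothetical names for a deny-by-default allow-list check.

```python
# Minimal sketch of an explicit, deny-by-default communication policy.
# Names (Policy, can_send) are illustrative, not from the paper.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Policy:
    """A component's 'guest list': the only peers it may send data to."""
    name: str
    allowed_peers: frozenset = field(default_factory=frozenset)

def can_send(sender: Policy, receiver: str) -> bool:
    """Deny by default: any destination not on the list is rejected."""
    return receiver in sender.allowed_peers

company_a = Policy("A", frozenset({"B"}))
assert can_send(company_a, "B")            # on the guest list: allowed
assert not can_send(company_a, "C")        # not listed: door slams shut
assert not can_send(company_a, "internet")
```

The key design point is that the default answer is "no": a component never gains a communication path by accident, only by declaring it up front.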

2. The "Chain of Custody" (The Inductive Step)

This is the coolest part. Mica doesn't just check your rules; it checks your neighbor's rules too.

  • If Company A says, "I only talk to B," Mica looks at Company B's rules.
  • If Company B says, "I only talk to C," Mica connects the dots.
  • Mica ensures that the data can flow from A → B → C, but cannot flow from A → B → The Internet.
  • Even if Company B is a "bad actor" who wants to leak data, Mica's rules prevent B from opening a door to the outside world. The data is trapped in the pipeline.
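
Connecting the dots across neighbors is just a reachability check over the declared policies. The sketch below (with an illustrative `reachable` helper, not Mica's real mechanism) computes everywhere data starting at A can end up and confirms that no path reaches the internet:

```python
# Sketch of the 'chain of custody' check: from each component's declared
# allow-list, compute every place data starting at A can reach, and
# verify no path leads to an unapproved sink. Names are illustrative.
from collections import deque

def reachable(policies: dict, start: str) -> set:
    """Breadth-first search over the declared communication graph."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for peer in policies.get(node, set()):
            if peer not in seen:
                seen.add(peer)
                queue.append(peer)
    return seen

policies = {"A": {"B"}, "B": {"C"}, "C": set()}  # C is the final sink
flow = reachable(policies, "A")
assert flow == {"A", "B", "C"}
assert "internet" not in flow  # even a malicious B cannot open this door
```

Because B's policy contains only C, a leak from B to the outside world is not a rule B can break at runtime; the path simply does not exist in the graph.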

3. The "Group Passport" (Attestation)

Usually, to prove a room is secure, you check the lock on that specific door. But with Mica, you get a Group Passport.

  • Instead of checking 100 individual rooms, a remote verifier (like a bank or a government) can check one single document that proves: "Yes, these 100 rooms are connected exactly as we planned, and no secret data can escape this specific chain."
  • This proves the entire workflow is secure, not just the individual parts.
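
One way to picture a "group passport" is a single digest that folds in every room's measurement plus the declared wiring between rooms. The hashing scheme below is purely illustrative (it is not Arm CCA's attestation format), but it shows why one document can vouch for the whole workflow:

```python
# Sketch of a 'group passport': fold each component's measurement and the
# declared connections into one digest a remote verifier can compare
# against the expected value. Illustrative scheme, not Arm CCA's format.
import hashlib

def group_attestation(measurements: dict, edges: list) -> str:
    """One digest covering every component and every declared connection."""
    h = hashlib.sha256()
    for name in sorted(measurements):              # each room's lock
        h.update(f"{name}={measurements[name]};".encode())
    for src, dst in sorted(edges):                 # how rooms are wired
        h.update(f"{src}->{dst};".encode())
    return h.hexdigest()

plan = group_attestation({"A": "aaa", "B": "bbb", "C": "ccc"},
                         [("A", "B"), ("B", "C")])
deployed = group_attestation({"A": "aaa", "B": "bbb", "C": "ccc"},
                             [("A", "B"), ("B", "C")])
assert deployed == plan  # workflow matches the plan: one check, not 100

tampered = group_attestation({"A": "aaa", "B": "evil", "C": "ccc"},
                             [("A", "B"), ("B", "C")])
assert tampered != plan  # any swapped component or rerouted pipe fails
```

The verifier never inspects the 100 rooms individually; if any component or connection differs from the plan, the digest no longer matches.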

How It Works (The Analogy of the "Secure Pipeline")

Think of Mica as a specialized water pipe system for your data.

  • The Pipes (Memory): In the old days, pipes were open to the air. If a pipe leaked, the water (data) spilled out. Mica creates sealed, transparent pipes between the rooms. You can see the water flowing, but no one can touch it or steal it.
  • The Valves (Policies): Mica installs smart valves.
    • If the pipe goes from the "Encoder" to the "Moderator," the valve only allows water to flow forward.
    • If the "Encoder" tries to open a valve to the "Internet," the valve is welded shut.
  • The Inspector (Attestation): Before the water starts flowing, an inspector walks the whole line. They check every valve and every pipe connection. Once they sign off, you get a certificate saying, "This entire pipeline is leak-proof."

Why Is This a Big Deal?

  1. No More "Trust Me": You don't need to trust the software companies. You just need to trust the Mica system (which is small and simple). Even if the companies are rivals or have buggy code, they physically cannot leak your data because the pipes are sealed by Mica.
  2. Smaller "Trusted" List: In the old world, you had to trust every single company's entire software stack (their OS, their apps, etc.). With Mica, you only trust the Mica system itself. The companies can be messy; Mica keeps them in line.
  3. Real-World Use: The paper shows this works for things like:
    • Video Moderation: A video goes from a user → an encoder → a nudity detector → storage. The encoder can't steal the video, and the detector can't leak it.
    • AI Chatbots: A user sends a prompt to a filter, then to an AI, then back to a filter. The AI can't talk to the internet to steal the user's prompt, and the filter ensures the AI doesn't say anything dangerous.

Summary

Mica is like building a secure train line between different cities (computing components).

  • Before: You had to trust that every train station manager wouldn't let passengers jump off the train or steal the luggage.
  • With Mica: The tracks are enclosed in a glass tunnel. The train can only stop at the stations on the schedule. If a station manager tries to open a door to the outside world, the tunnel seals itself. You get a ticket (attestation) that proves the train stayed on the tracks the whole time.

It allows us to build complex, secure systems out of parts that don't trust each other, simply by enforcing strict, verifiable rules on how they can share information.