Cognitive Warfare: Definition, Framework, and Case Study

This article proposes a unified definition of cognitive warfare and an OODA loop-based interaction framework to resolve current definitional inconsistencies. The framework enables joint force leaders and analysts to assess, compare, and evaluate cognitive campaigns through measurable attributes of superiority.

Bonnie Rushing, William Hersch, Shouhuai Xu

Published 2026-03-06

Here is an explanation of the paper "Cognitive Warfare: Definition, Framework, and Case Study," in simple, everyday language with creative analogies.

The Big Idea: It's Not About What You See, It's About How You Think

Imagine two people playing chess. In the old days, war was like a game of Checkers: you hit the other person's pieces with your pieces until they couldn't move anymore. That's physical warfare.

But today, the game has changed. The paper argues that modern conflict is less like Checkers and more like trying to convince your opponent that the board is actually a trampoline, that the black pieces are white, and that moving their King is a terrible idea.

This is Cognitive Warfare. It's not about destroying tanks or shooting missiles; it's about hacking human minds so that people make bad decisions, freeze up, or act against their own interests.


The Problem: We Don't Know How to Measure It

Right now, the military and governments are confused about how to fight this. They often treat it like "Information Operations"—basically, just counting how many people saw a fake post or how many likes a video got.

The authors say: "That's the wrong scoreboard."
Just because a fake story went viral (high "likes") doesn't mean the enemy won. The real question is: Did it make the General hesitate? Did it make the troops panic? Did it cause them to make a mistake?

The paper tries to fix this by creating a new rulebook (a framework) to measure who is actually winning the war of the mind.


The New Rulebook: The "Multi-Horizon" Game

The authors introduce a framework based on a famous concept called the OODA Loop (Observe, Orient, Decide, Act). Think of the OODA Loop as the cycle you run through every time you react to a surprise; the faster you complete it, the faster you can act.

  • Observe: What do you see?
  • Orient: What does it mean?
  • Decide: What will you do?
  • Act: Do it.
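The loop above can be sketched as a simple timed cycle. This is an illustrative model only (the function name and the delay numbers are invented here, not taken from the paper): it shows how slowing any single stage delays the whole decision, which is exactly what a cognitive attack tries to do.

```python
# Illustrative sketch of an OODA loop as a timed cycle.
# All names and numbers are hypothetical, not from the paper.

def ooda_cycle_time(observe, orient, decide, act):
    """Total time (in arbitrary units) for one pass through the loop.

    Each argument is how long that stage takes; an attack on any
    stage lengthens the whole cycle.
    """
    return observe + orient + decide + act

# A calm decision-maker...
baseline = ooda_cycle_time(observe=2, orient=3, decide=2, act=1)      # 8

# ...versus one flooded with noise, lies, and scary images,
# which slows Observe, Orient, and Decide.
under_attack = ooda_cycle_time(observe=6, orient=9, decide=5, act=1)  # 21

# Whoever finishes their loop first gets to act inside the
# opponent's cycle -- the classic OODA advantage.
attacker_cycle = 5
print(attacker_cycle < under_attack)  # True: the attacker now acts first
```

The point of the toy numbers: the attacker never has to destroy anything, only make your loop longer than theirs.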

The paper says cognitive warfare attacks this loop in two different timeframes, like a video game with two modes:

1. The "Acute" Mode (The Sudden Storm)

  • Timeframe: Minutes to days.
  • The Analogy: Imagine someone suddenly shouting "FIRE!" in a crowded theater.
  • The Attack: The enemy floods you with confusing noise, lies, or scary images right when you need to make a quick decision.
  • The Goal: Make you freeze, panic, or run the wrong way. They want to slow down your "Observe" and "Decide" steps so you miss your chance to act.

2. The "Chronic" Mode (The Slow Poison)

  • Timeframe: Weeks, months, or years.
  • The Analogy: Imagine someone slowly convincing you that the theater is actually a haunted house and that the fire alarm is a ghost.
  • The Attack: The enemy slowly erodes your trust in your leaders, your news, and your own judgment. They change your "settings" so that when the "FIRE!" shout happens later, you believe it immediately because you've been conditioned to distrust the truth.
  • The Goal: To re-program your brain so that even before the storm hits, you are already confused and vulnerable.

The Scoreboard: How Do We Know Who Wins?

The paper defines Cognitive Superiority not as "having the most followers," but as having the fastest, most resilient decision-making process.

Think of it like a Spam Filter vs. a Spammer:

  • The Spammer (Attacker): Tries to get their fake emails into your inbox. They don't need to convince you to buy anything; they just need to make you doubt the real emails so you miss a critical message.
  • The Filter (Defender): Tries to keep the real emails safe.

Who wins?

  • If the Spammer sends 1,000 emails for $10, and the Filter has to hire 10 people and spend $10,000 to sort them out, the Spammer has the advantage.
  • If the Spammer makes you delete a real, urgent email because you thought it was fake, they have Cognitive Superiority.

The paper says the winner is the side that can:

  1. Break the other side's decisions (make them slow or wrong).
  2. Keep their own decisions fast and clear.
  3. Do it cheaper and faster than the other side can fix it.
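The cost asymmetry in the spam analogy can be put in numbers. A minimal sketch (the function name and the "ratio" framing are assumptions for illustration), using the $10-versus-$10,000 figures from the example above:

```python
def cost_exchange_ratio(attacker_cost, defender_cost):
    """How many dollars the defender spends per attacker dollar.

    A ratio far above 1 means the attacker imposes costs cheaply --
    which maps to the third condition above: doing it cheaper and
    faster than the other side can fix it.
    """
    return defender_cost / attacker_cost

# The spammer example from the text: $10 to send 1,000 emails,
# $10,000 for the defender (10 hires) to sort them out.
ratio = cost_exchange_ratio(attacker_cost=10, defender_cost=10_000)
print(ratio)  # 1000.0 -- every attacker dollar costs the defender $1,000
```

At a 1,000-to-1 exchange rate, the defender can "win" every individual skirmish and still bleed out over time.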

The Case Study: A Fictional War in "Norland"

To prove their point, the authors created a fictional scenario:

  • The Setup: A US-led force (Blue) is helping a country called Norland. An enemy (Red) wants to stop them.
  • The Attack: Red creates a fake rumor that a US convoy accidentally hurt a local family. They flood social media with angry videos and fake news.
  • The Result:
    • Blue's Reaction: The US commanders get confused. They argue about whether the video is real. They delay moving their trucks. They hesitate to speak to the press.
    • Why Blue Lost: Red moved faster. Red's fake stories spread instantly. Blue had to wait for lawyers and fact-checkers to verify the truth. By the time Blue figured it out, Red had already convinced the local population that the US was dangerous.
  • The Lesson: Blue didn't lose because they were weak; they lost because they were slower and more expensive to coordinate than the enemy was to lie.

What Should We Do? (The Takeaway)

The paper suggests three big changes for the military and leaders:

  1. Stop counting "Likes": Don't measure success by how many people saw a message. Measure success by how fast and accurately your team can still make decisions when the enemy is lying.
  2. Train your brain like a muscle: Just as soldiers lift weights to carry heavy packs, they need to train their minds to spot lies and stay calm under pressure. This is called "Cognitive Resilience."
  3. Speed is everything: The military needs to get better at verifying facts and making decisions quickly. If the enemy can lie faster than you can tell the truth, you will lose the war of the mind.

In a Nutshell

Cognitive Warfare is the battle to control how people think and decide. The paper argues that the side that wins isn't the one with the biggest megaphone, but the one that can keep its own head clear and make good decisions while the other side is getting confused, scared, and paralyzed by lies. It's a game of speed, resilience, and cost, where the prize is the ability to act without fear or hesitation.