A Consensus-Bayesian Framework for Detecting Malicious Activity in Enterprise Directory Access Graphs

This paper proposes a consensus-based Bayesian framework that models enterprise directory access as a multi-level interaction graph to detect malicious activity by identifying logical perturbations in strongly connected components through opinion dynamics and time-evolving anomaly scoring.

Pratyush Uppuluri, Shilpa Noushad, Sajan Kumar

Published 2026-03-05

Imagine a massive, bustling office building (the Enterprise) where thousands of employees (Users) constantly move between different rooms, filing cabinets, and digital lockers (Directories).

Normally, people stick to their own teams. The marketing team hangs out in the Marketing room; the engineers stay in the Engineering lab. They know who they trust, who they work with, and what files they usually touch. This is the "normal" flow of the building.

But what happens when a spy (a Malicious Actor) sneaks in? Or when an employee gets confused and starts wandering into rooms they've never entered before? Traditional security guards just look at the ID badges (encryption) or check the logs after the crime happens (forensics). This paper proposes a smarter way: a "Gossip Network" that spots trouble before it spreads.

Here is how the authors' system works, broken down into simple concepts:

1. The "Gossip" Map (Opinion Dynamics)

Think of every employee as having an "opinion" about what they should be doing.

  • The Normal State: In a healthy team, everyone agrees on the plan. If the Marketing team decides to work on a specific project, they all move in sync. Their "opinions" converge.
  • The Graph: The system draws a map showing who influences whom. If Alice usually asks Bob for help, there's a line connecting them. If Bob suddenly stops listening to Alice and starts listening to a stranger, the map changes.
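The converging-opinion idea can be sketched in a few lines of Python using classic DeGroot-style averaging, where each person repeatedly replaces their opinion with a weighted average of the people they listen to. The trust weights and names below are made up for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical trust matrix: row i says how much person i weighs
# each colleague's opinion (rows sum to 1).
W = np.array([
    [0.5, 0.5, 0.0],   # Alice listens to herself and Bob
    [0.3, 0.4, 0.3],   # Bob listens to everyone
    [0.0, 0.5, 0.5],   # Carol listens to Bob and herself
])

x = np.array([1.0, 0.0, -1.0])  # initial (disagreeing) opinions

for _ in range(100):
    x = W @ x  # everyone averages the opinions of who they trust

print(np.round(x, 3))  # all three entries end up (nearly) equal: consensus
```

Because everyone in this little circle eventually influences everyone else, the opinions collapse to a single shared value, which is exactly the "healthy team" baseline the system expects.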

2. The "Trust Circles" (Strongly Connected Components)

The system groups people into tight-knit circles (called SCCs).

  • Closed Circles: These are teams that only talk to each other. They have their own internal rules. If they all agree, they are stable.
  • Open Circles: These are teams that take advice from outside.
  • The Rule: As long as everyone in a circle follows the same "logic" (e.g., "We only open files from the server"), the group stays calm and stable.
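Finding these circles in a "who listens to whom" graph is a standard graph problem. Here is a minimal sketch using Kosaraju's algorithm (two depth-first passes, one on the reversed graph); the edges are a hypothetical example, not data from the paper:

```python
from collections import defaultdict

# Hypothetical "who listens to whom" edges.
edges = [("alice", "bob"), ("bob", "alice"),    # a closed circle
         ("carol", "dave"), ("dave", "carol"),  # another closed circle
         ("bob", "carol")]                      # one-way advice, no cycle back

def sccs(edges):
    graph, rev, nodes = defaultdict(list), defaultdict(list), set()
    for u, v in edges:
        graph[u].append(v)
        rev[v].append(u)
        nodes |= {u, v}

    # Pass 1: record DFS finishing order on the original graph.
    order, seen = [], set()
    def dfs1(u):
        seen.add(u)
        for w in graph[u]:
            if w not in seen:
                dfs1(w)
        order.append(u)
    for u in nodes:
        if u not in seen:
            dfs1(u)

    # Pass 2: DFS the reversed graph in reverse finishing order;
    # each tree found is one strongly connected component.
    comps, seen = [], set()
    def dfs2(u, comp):
        seen.add(u)
        comp.add(u)
        for w in rev[u]:
            if w not in seen:
                dfs2(w, comp)
    for u in reversed(order):
        if u not in seen:
            comp = set()
            dfs2(u, comp)
            comps.append(comp)
    return comps

circles = sccs(edges)
print(circles)  # two trust circles: {alice, bob} and {carol, dave}
```

Note that Bob advising Carol does not merge the two circles: an SCC requires influence to flow both ways, which is what makes it a genuinely closed "trust circle."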

3. The "Whistleblower" (Detecting the Anomaly)

The magic happens when someone breaks the rules.

  • The Scenario: Imagine a hacker (or a confused employee) starts accessing files they shouldn't. Suddenly, User A (who usually only talks to User B) starts taking orders from User Z (a stranger in a different department).
  • The Shift: This changes the "logic" of the group. The group's internal agreement breaks. Instead of everyone moving in harmony, they start arguing or moving in different directions.
  • The Signal: The system measures Variance. Think of this as the "noise level" in the room.
    • Normal: Low noise. Everyone is humming the same tune.
    • Anomaly: High noise. Someone is screaming a different song. The system sees this spike in "noise" (variance) and knows something is wrong.
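The variance signal can be sketched by rerunning the opinion averaging twice: once with a healthy circle, and once where one member has gone "stubborn" and follows a different logic while the team lead holds firm to the old one. The numbers are illustrative, not from the paper:

```python
import numpy as np

def final_variance(W, x, steps=100):
    """Run opinion averaging and return the group's final 'noise level'."""
    for _ in range(steps):
        x = W @ x
    return float(np.var(x))

# Healthy circle: everyone keeps listening to each other.
healthy = np.array([
    [0.5, 0.5, 0.0],
    [0.3, 0.4, 0.3],
    [0.0, 0.5, 0.5],
])

# Compromised circle: the team lead holds the team's logic, the rogue
# holds a different one, and the honest member is torn between them.
compromised = np.array([
    [1.0, 0.0, 0.0],   # team lead: never budges from the team's policy
    [0.3, 0.4, 0.3],   # honest member: listens to both sides
    [0.0, 0.0, 1.0],   # rogue: only "listens" to themselves
])

x0 = np.array([1.0, 0.5, -1.0])

healthy_noise = final_variance(healthy, x0)      # collapses to ~0
rogue_noise = final_variance(compromised, x0)    # stays stuck well above 0
print(healthy_noise, rogue_noise)
```

The healthy group's variance decays to essentially zero, while the compromised group gets frozen in permanent disagreement. That persistent gap between the two noise levels is the spike the system watches for.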

4. The "Smart Detective" (Bayesian Scoring)

The system doesn't just scream "ALARM!" the second it hears a noise. It acts like a smart detective using Bayesian Logic (a way of updating beliefs based on new evidence).

  • The Prior: The detective starts with a hunch: "It's probably safe, but I'm watching." (Low probability of a crime).
  • The Evidence: Every time the "noise" (variance) gets louder, the detective updates their hunch.
    • Small noise: "Maybe just a misunderstanding." (Probability goes up slightly).
    • Loud, persistent noise: "Okay, this is definitely a break-in!" (Probability shoots up to 99%).
  • The Result: The system gives a score from 0% to 100%. If the score climbs too high, it flags the user as malicious.
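The detective's updating rule is just Bayes' theorem applied repeatedly. Here is a minimal sketch; the likelihood numbers (how often noise appears for malicious versus benign users) are made-up illustrations, not values from the paper:

```python
def update(p_malicious, high_variance):
    """One round of Bayesian updating on the 'is this user malicious?' belief."""
    # How likely is this observation under each hypothesis? (Assumed values.)
    p_obs_if_malicious = 0.8 if high_variance else 0.2
    p_obs_if_benign    = 0.1 if high_variance else 0.9
    # Bayes' rule: posterior is proportional to likelihood times prior.
    num = p_obs_if_malicious * p_malicious
    den = num + p_obs_if_benign * (1 - p_malicious)
    return num / den

score = 0.01  # the prior hunch: "probably safe, but I'm watching"
for noisy_reading in [True, True, True, True]:  # loud, persistent noise
    score = update(score, noisy_reading)
    print(f"suspicion score: {score:.1%}")
```

A single noisy reading only nudges the score up a little (a possible misunderstanding), but four in a row compound into near-certainty, which is exactly the "persistent noise" behavior described above.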

5. Why This is Better Than Old Methods

  • Old Way: "You accessed a file you never touched before. Arrest you!" (Too many false alarms; people change jobs and need new files).
  • This Way: "You accessed a new file, BUT you also stopped listening to your team, started listening to a stranger, and your whole team is now confused. That's suspicious."
  • It looks at the relationships, not just the single action. It understands that a team moving together is normal, but a team falling apart because of one person is a red flag.

The Big Picture

This paper builds a digital immune system for companies. Instead of just checking if a key fits a lock, it watches how the "cells" (users) interact. If a cell starts acting weird and infecting its neighbors, the system detects the "infection" (malicious activity) instantly by measuring how much the group's harmony is disrupted.

In short: It turns the chaotic noise of a busy office into a clear signal, spotting the bad guys by the way they disrupt the team's rhythm.