ODD-SEC: Onboard Drone Detection with a Spinning Event Camera

This paper presents ODD-SEC, a real-time onboard drone detection system for moving carriers that utilizes a spinning event camera and a novel motion-compensation-free representation to achieve 360-degree surveillance with high accuracy under challenging conditions.

Kuan Dai, Hongxin Zhang, Sheng Zhong, Yi Zhou

Published 2026-03-09

Imagine you are trying to spot a tiny, fast-moving drone in the sky while you are riding a bumpy, spinning merry-go-round. That is basically the challenge this paper solves.

Here is the story of ODD-SEC, a new system designed to catch drones even when the camera itself is moving and spinning.

1. The Problem: The "Static" Camera Trap

Most security cameras are like statues. They stand still and take pictures (frames) of the world.

  • The Issue: If a drone flies fast, a normal camera sees a blur. If the sun is too bright or it's too dark, the camera goes blind.
  • The "Event" Camera: Scientists invented a special camera called an Event Camera. Instead of taking full photos, it acts like a super-fast nervous system. It only "feels" when a pixel changes brightness. It's like a room full of people who only shout "Hey!" when something moves. This makes it incredibly fast and great in bright sun or total darkness.
  • The Catch: These event cameras usually have a narrow view (like looking through a straw). If a drone flies outside that straw, you miss it. Also, most systems assume the camera is standing still on a tripod. But what if the camera is on a robot dog running around? The movement creates chaos that breaks the software.
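To make the "room full of people who shout when something moves" idea concrete, here is a toy sketch of how an event sensor differs from a frame camera. Real event cameras do this in analog circuitry per pixel; this simulation (function name, threshold value, and frame-based setup are all illustrative, not from the paper) just compares two frames in log-brightness space:

```python
import numpy as np

def events_from_frames(prev, curr, threshold=0.2):
    """Toy event-camera model: a pixel fires an event whenever its
    log-brightness changes by more than a contrast threshold.
    Returns (row, col, polarity) tuples for each fired pixel."""
    # Real event sensors compare log intensities (eps avoids log(0)).
    diff = np.log(curr + 1e-6) - np.log(prev + 1e-6)
    rows, cols = np.where(np.abs(diff) > threshold)
    # Polarity: +1 means the pixel got brighter, -1 means darker.
    return [(int(r), int(c), int(np.sign(diff[r, c])))
            for r, c in zip(rows, cols)]

# A static scene produces no events; only the changed pixel "shouts".
prev = np.full((4, 4), 0.5)
curr = prev.copy()
curr[1, 2] = 1.0  # one pixel gets brighter
print(events_from_frames(prev, curr))  # → [(1, 2, 1)]
```

Notice that if nothing changes, the output is empty; that sparsity is what makes event cameras so fast and so tolerant of extreme lighting.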

2. The Solution: The "Spinning Lighthouse"

The team built a system called ODD-SEC (Onboard Drone Detection with a Spinning Event Camera).

  • The Hardware: They took that narrow "straw" camera and mounted it on a motor that spins it around 360 degrees, like a lighthouse beam sweeping the horizon.
  • The Result: Instead of a narrow straw, the camera now sees a full panoramic circle around the robot. No matter which way the drone flies, it will eventually cross the camera's path.
  • The Challenge: Spinning a camera creates a massive amount of "motion blur" and data chaos. If you just feed this spinning data into a normal AI, it gets confused and thinks the whole world is moving, not just the camera.
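The lighthouse trick works because the motor's angle is known at every instant, so a detection in the narrow image can be mapped to a direction around the robot. Here is a minimal sketch of that geometry; the function name, image width, and field-of-view numbers are assumptions for illustration, not values from the paper:

```python
def event_azimuth(motor_angle_deg, pixel_col, width=640, hfov_deg=60.0):
    """Toy geometry: combine the spin motor's angle with a detection's
    pixel column to get a global azimuth (0-360 deg) around the robot.
    All parameter values here are illustrative."""
    # How far this column sits from the optical center, in degrees.
    offset = (pixel_col - width / 2) / width * hfov_deg
    return (motor_angle_deg + offset) % 360.0

# A drone at the image center inherits the motor angle directly...
print(event_azimuth(90.0, 320))  # → 90.0
# ...while one at the right edge is half the field of view away.
print(event_azimuth(90.0, 640))  # → 120.0
```

As the motor sweeps through a full revolution, every azimuth passes through the camera's narrow window, which is why the drone cannot hide anywhere in the circle.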

3. The Secret Sauce: The "Time-Slice Sandwich"

To fix the spinning chaos, the team invented a clever way to process the data.

  • The Analogy: Imagine you are trying to watch a movie, but the projector is spinning wildly. The image is a mess.
  • The Fix: Instead of trying to watch the whole spinning mess at once, the system cuts the video into tiny, frozen slices of time (like slicing a loaf of bread).
  • The Magic: They feed these slices into a special AI brain (a modified version of a famous object detector called YOLOX). This AI has a special "Time Fusion" module. Think of this module as a conductor in an orchestra. It listens to the different slices, figures out which sounds (pixels) belong to the drone and which are just background noise caused by the spinning, and then harmonizes them to find the target.
  • No Motion Compensation: Usually, you have to mathematically "undo" the camera's spin to see clearly. This system is smart enough to learn the patterns of the spin and ignore them automatically, like a surfer who stays balanced on a moving wave without thinking about the wave's physics.
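The "loaf of bread" slicing above can be sketched as plain event binning: chop the stream into a few short time windows and accumulate event polarities into one frame per window, with no attempt to undo the spin. The function name, slice count, and sensor size below are illustrative assumptions, not the paper's actual configuration:

```python
import numpy as np

def slice_events(events, t_start, t_end, num_slices=5, shape=(48, 64)):
    """Toy 'time-slice' representation: bin (t, x, y, polarity) events
    into a stack of frozen frames, one per slice, motion-compensation-free.
    The downstream network (the 'Time Fusion' conductor) would consume
    this stack and learn to separate drone from spin-induced clutter."""
    H, W = shape
    stack = np.zeros((num_slices, H, W), dtype=np.float32)
    dt = (t_end - t_start) / num_slices
    for t, x, y, p in events:
        # Clamp so an event at exactly t_end lands in the last slice.
        idx = min(int((t - t_start) / dt), num_slices - 1)
        stack[idx, y, x] += p  # accumulate polarity per slice
    return stack

# Three events spread over a 10 ms window land in slices 0, 2, and 4.
events = [(0.000, 10, 5, +1), (0.004, 11, 5, +1), (0.009, 12, 5, -1)]
stack = slice_events(events, t_start=0.0, t_end=0.010)
print(stack.shape)      # → (5, 48, 64)
print(stack[0, 5, 10])  # → 1.0
```

Because each slice freezes only a sliver of the spin, the blur inside any single slice is small, and the ordering of slices carries the motion pattern the network learns to exploit.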

4. The Real-World Test: The Robot Dog

They didn't just test this in a lab. They put the system on a quadruped robot (a robot dog) and ran it outside.

  • The Setup: The robot dog ran around, and a DJI drone flew nearby. The camera spun 360 degrees on the dog's back.
  • The Conditions: They tested it in blinding sunlight, in the dark, and while the robot was jiggling and turning.
  • The Result: The system worked like a charm. It detected the drone in real time (about 22 detections per second) and reported the drone's direction to within about 2 degrees. That's like spotting a friend in a crowd and pointing straight at them with almost perfect precision, even while you are running.

Why Does This Matter?

  • Security: It allows robots (like robot police dogs or security bots) to hunt down rogue drones without needing a human to hold a camera.
  • Robustness: It works when normal cameras fail (too bright, too dark, too fast).
  • Mobility: It proves you can have a 360-degree "super-vision" system that works while moving, which is a huge step forward for autonomous robots.

In short: ODD-SEC is a spinning, super-fast eye on a robot dog that never gets dizzy, never gets blinded by the sun, and can spot a tiny drone flying anywhere around it, all while the robot is running a marathon.