A 360-degree Multi-camera System for Blue Emergency Light Detection Using Color Attention RT-DETR and the ABLDataset

This paper presents a 360-degree multi-camera system for detecting blue emergency lights. It combines a curated ABLDataset with an enhanced RT-DETR model that uses a color attention block to achieve high detection accuracy, azimuthal localization, and approach angle estimation, for integration into Advanced Driver Assistance Systems.

Francisco Vacalebri-Lloret, Lucas Banchero, Jose J. Lopez, Jose M. Mossi

Published 2026-03-06

Imagine you are driving down a busy street. You are focused on the road ahead, checking your mirrors, and listening to the radio. Suddenly, an ambulance or fire truck is rushing toward you, but you don't hear its siren yet, or maybe it's too loud outside to hear it clearly. In the old days, you'd have to rely on luck or a sudden flash of blue light to realize, "Oh no, I need to move!"

This paper is about building a super-powered "sixth sense" for your car that acts like a vigilant co-pilot, constantly scanning 360 degrees around the vehicle to spot emergency vehicles before they become a danger.

Here is the breakdown of how they did it, using some everyday analogies:

1. The Eyes: The "Owl's Gaze" (360° Cameras)

Most cars have a few cameras, but this system uses four fish-eye cameras mounted on the front, back, and sides.

  • The Analogy: Think of a normal camera as a human eye looking straight ahead. A fish-eye lens is like an owl's head that can swivel 180 degrees without moving its body. By placing four of these "owls" on the car, the system creates a perfect, unbroken circle of vision. It sees everything behind you, beside you, and in front of you simultaneously, just like a security guard with a spinning head.
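For the geometrically curious, the "unbroken circle" claim is easy to check. The sketch below assumes an idealized rig (cameras facing 0°, 90°, 180°, and 270°, each with a 180° horizontal field of view; the paper's exact mounting geometry is not reproduced here) and verifies that every bearing around the car falls inside at least one camera's view:

```python
def covered(azimuth_deg, cam_yaws=(0.0, 90.0, 180.0, 270.0), hfov_deg=180.0):
    """Return True if at least one camera's field of view contains the bearing.

    The yaws and the 180-degree field of view are illustrative assumptions,
    not the paper's measured rig geometry.
    """
    for yaw in cam_yaws:
        # Smallest signed angle between the bearing and this camera's axis.
        diff = (azimuth_deg - yaw + 180.0) % 360.0 - 180.0
        if abs(diff) <= hfov_deg / 2.0:
            return True
    return False

# Every whole-degree bearing around the car is covered; bearings on the
# diagonals (e.g. 45 degrees) even fall inside two overlapping views.
full_circle = all(covered(a) for a in range(360))
print(full_circle)
```

With four 180° lenses at right angles, adjacent views overlap by 90°, which is what makes the circle "unbroken" even if one camera's edge pixels are heavily distorted.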

2. The Brain: The "Color-Sniffing Detective" (AI Model)

The car needs a brain to figure out what it's seeing. The researchers tested many different "brains" (AI models like YOLO and Faster R-CNN), but they found that one called RT-DETR was the best at the job.

  • The Upgrade: They realized that standard AI sometimes gets confused by white lights or reflections. So, they gave this AI a special pair of glasses called a "Color Attention Block."
  • The Analogy: Imagine you are looking for a specific blue balloon in a sea of white clouds. A normal person might get distracted by all that white. But if you put on glasses tuned to pick out blue, the clouds fade into the background and the blue balloon pops out instantly. That's what this "Color Attention" does: it tells the AI, "Ignore everything that isn't blue or bluish-white. Focus only on the emergency lights."
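Conceptually, the block re-weights features by how "blue" a region looks. The paper's Color Attention Block is a learned module inside RT-DETR; the toy sketch below (the sigmoid gain and the "blueness" measure are assumptions made for illustration) only shows the gating idea on single pixels:

```python
import math

def blue_attention_weight(r, g, b, k=8.0):
    """Toy per-pixel gate: close to 1 when blue dominates red and green.

    k is a hand-tuned gain, not a learned parameter; the real module
    learns how to weight color inside the detection network.
    """
    # Normalize 0-255 channels to 0-1 and measure blue dominance.
    blueness = b / 255.0 - max(r, g) / 255.0
    return 1.0 / (1.0 + math.exp(-k * blueness))

# A blue emergency light keeps almost all of its feature response...
blue_gate = blue_attention_weight(40, 60, 240)
# ...a white headlight (R ≈ G ≈ B) is left near the neutral 0.5...
white_gate = blue_attention_weight(230, 230, 235)
# ...and a red tail light is suppressed toward 0.
red_gate = blue_attention_weight(240, 40, 40)
print(blue_gate, white_gate, red_gate)
```

The point of the analogy survives in the numbers: strongly blue pixels pass through almost untouched, while distractor colors are damped before the detector reasons about them.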

3. The Training: The "YouTube Detective" (The Dataset)

To teach this AI, you need a massive library of pictures. But there was a problem: no one had a library of just flashing blue emergency lights.

  • The Solution: The team became "detectives" on YouTube. They hunted down videos made by "emergency vehicle spotters" (people who love filming ambulances and police cars). They downloaded thousands of clips and manually drew boxes around every single blue light they could find.
  • The Result: They created a new, custom textbook called the ABLDataset. It's like a specialized training manual that teaches the AI to recognize blue lights in rain, snow, day, and night, regardless of what country the car is in.
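To make the "drawing boxes" step concrete, a labelled frame might look like the hypothetical record below. The field names, file path, and box coordinates are invented for illustration; the paper does not publish the ABLDataset's actual schema here:

```python
# A hypothetical COCO-style annotation record; every value is made up
# for illustration and is not taken from the real ABLDataset.
frame_annotation = {
    "image": "clips/spotter_video_0137/frame_0442.jpg",
    "conditions": {"time": "night", "weather": "rain"},
    "boxes": [
        # [x, y, width, height] in pixels, one per visible blue light.
        {"label": "blue_emergency_light", "bbox": [612, 288, 34, 21]},
        {"label": "blue_emergency_light", "bbox": [655, 290, 30, 19]},
    ],
}

print(len(frame_annotation["boxes"]), "blue lights labelled in this frame")
```

Tagging conditions like time of day and weather is what lets the team verify the model works "in rain, snow, day, and night" rather than only on sunny clips.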

4. The Map: The "Compass" (Calibration)

Knowing that there is an emergency vehicle is good, but knowing where it is matters even more. The system doesn't just say "Emergency vehicle detected!"; it says "Emergency vehicle detected 45 degrees to your left."

  • How it works: They used a mathematical process called calibration, which measures exactly how each lens bends light, to map the distorted fish-eye images onto a flat map of the road around the car.
  • The Analogy: Imagine looking at a reflection in a funhouse mirror. It's distorted and curved. The system takes that curved image and "unfolds" it mathematically to tell the driver exactly where the danger is coming from, like a compass pointing directly at the threat.
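As a rough sketch of the "unfolding" step, assume an idealized fisheye in which image columns map linearly to horizontal angle along the horizon (an assumption for illustration; real calibration also models the lens's exact distortion and each camera's mounting offsets):

```python
def pixel_to_azimuth(u, image_width, camera_yaw_deg, hfov_deg=180.0):
    """Toy bearing estimate: map an image column to an azimuth around the car.

    Assumes columns map linearly to angle across hfov_deg, and that the
    camera's optical axis points at camera_yaw_deg. Both are idealized
    stand-ins for the paper's full calibration procedure.
    """
    # Pixel offset from the image centre, as a fraction of the half-width.
    frac = (u - image_width / 2.0) / (image_width / 2.0)
    bearing = camera_yaw_deg + frac * hfov_deg / 2.0
    return bearing % 360.0

# A light seen left of centre in the left-facing camera (yaw 270 degrees)
# resolves to a bearing a little under 270 degrees from straight ahead.
print(pixel_to_azimuth(u=500, image_width=1280, camera_yaw_deg=270.0))
```

Once every detection carries a bearing like this, the dashboard can say "45 degrees to your left" instead of just "somewhere nearby".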

5. The Results: The "Night Owl" vs. The "Daytime Eagle"

They tested the system in the real world, driving cars with fake emergency lights.

  • Daytime: The system is like an eagle, spotting lights clearly up to 30 meters (about 100 feet) away.
  • Nighttime: It gets even better! Because the blue lights contrast sharply against the dark, the system can spot them from up to 70 meters (over 200 feet) away.
  • The Magic: Even if the siren is off, or the vehicle is far away, this system spots the blue light and alerts the driver instantly.

Why Does This Matter?

Think of this system as a safety net.

  • For the Driver: It gives you extra time to react. Instead of swerving at the last second, you get a gentle nudge on your dashboard screen saying, "Ambulance coming from the left," giving you time to move over calmly.
  • For the Emergency Team: It helps them get through traffic faster and safer, potentially saving lives.
  • For the Future: This is a building block for self-driving cars. If a car can "see" and "understand" emergency lights, it can automatically pull over, making our roads safer for everyone.

In a nutshell: The researchers built a 360-degree "owl-eye" vision system, taught it to ignore distractions using tinted glasses (color attention), and trained it on a massive library of YouTube videos. The result is a car that is far less likely to miss a flashing blue light, no matter where it comes from or what time of day it is.