Lightweight 3D LiDAR-Based UAV Tracking: An Adaptive Extended Kalman Filtering Approach

This paper presents a lightweight 3D LiDAR-based UAV tracking system for small drones that utilizes an Adaptive Extended Kalman Filter to achieve robust, high-accuracy relative positioning in GPS-denied environments by dynamically adjusting to sparse and noisy point cloud data.

Nivand Khosravi, Meysam Basiri, Rodrigo Ventura

Published Wed, 11 Ma

Imagine you are trying to play a game of "tag" in a pitch-black room, but instead of using your eyes, you are using a special flashlight that only sees in 3D dots (like a cloud of tiny stars). This is the challenge faced by drones (UAVs) trying to find and follow each other, especially when there is no GPS signal (like inside a building or a dense forest).

This paper presents a new, smart way for a "chaser" drone to track a "target" drone using a lightweight 3D laser scanner (LiDAR), even when the data is messy, incomplete, or the target is doing crazy acrobatic moves.

Here is the breakdown of their solution using simple analogies:

1. The Problem: The "Flickering Flashlight"

Most drones use cameras to see. But cameras are like human eyes: they fail in the dark, fog, or bright sun.

  • The Solution: They use LiDAR, which is like a flashlight that shoots out millions of tiny laser dots to build a 3D map. It works in total darkness.
  • The Catch: The specific laser scanner they use (the Livox Mid-360) is "non-repetitive." Imagine a sprinkler that doesn't spray water in neat, predictable circles, but instead sprays in random, shifting patterns.
    • Result: Sometimes the "target" drone looks like a solid ball of dots. Other times, it looks like a few scattered specks. Sometimes it disappears completely behind a tree (occlusion).
    • The Old Way: Traditional tracking systems are like a rigid robot. They assume the dots will always be in the same place and the same number. When the data gets messy, the robot gets confused and crashes (or loses the target).

2. The Solution: The "Smart Navigator" (Adaptive Kalman Filter)

The authors built a new tracking system called an Adaptive Extended Kalman Filter (AEKF). Think of this as a super-smart navigator who drives a car in heavy fog.

  • Standard Navigator (Old Method): This navigator drives assuming the fog is always the same thickness. If the fog suddenly gets thicker (bad data), the navigator keeps driving at the same speed and crashes because they didn't adjust.
  • The New Smart Navigator (AEKF): This navigator constantly asks, "How clear is the view right now?"
    • If the view is clear: It trusts the sensor (the LiDAR measurements) and drives confidently.
    • If the view is blurry (sparse data): It says, "I'm not sure where the car is, so I'll slow down and rely more on my guess of where it should be based on its speed."
    • If the car disappears (occlusion): It doesn't panic. It keeps guessing the path based on the last known speed and direction, but it admits, "I'm getting less sure every second." As soon as the car reappears, it instantly snaps back to the correct path.
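The navigator's three behaviors map directly onto a Kalman filter's predict/update loop. Below is a minimal sketch of that loop with a crude innovation-based noise adaptation bolted on. This is an illustration of the general idea, not the authors' implementation: the constant-velocity model, frame period, and noise values are all assumptions (and with a linear model like this, the "extended" filter reduces to a standard Kalman filter).

```python
import numpy as np

# Hypothetical constant-velocity model: state = [x, y, z, vx, vy, vz].
dt = 0.1                                      # assumed LiDAR frame period (s)
F = np.eye(6)
F[:3, 3:] = dt * np.eye(3)                    # position advances by velocity * dt
H = np.hstack([np.eye(3), np.zeros((3, 3))])  # LiDAR gives position only

def predict(x, P, Q):
    # Coast on the motion model; uncertainty P grows every step
    # ("I'm getting less sure every second").
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, R_base):
    y = z - H @ x                             # innovation: how surprising is z?
    # Crude adaptive twist: a big innovation (jittery/sparse data) inflates
    # the measurement noise R, so the filter leans on its own prediction.
    R = R_base * (1.0 + float(y @ y))
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # gain: how much to trust z
    x = x + K @ y
    P = (np.eye(6) - K @ H) @ P
    return x, P

x, P = np.zeros(6), np.eye(6)
Q, R_base = 0.01 * np.eye(6), 0.05 * np.eye(3)
x, P = predict(x, P, Q)                       # occlusion frames run only this line
x, P = update(x, P, np.array([1.0, 0.5, 2.0]), R_base)
```

During an occlusion, only `predict` runs, so the estimate coasts along the last known velocity while `P` inflates; when a measurement returns, `update` snaps the estimate back toward it.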

3. The "Magic Tricks" They Used

To make this work on a small, lightweight drone (which has limited battery and computer power), they added three special features:

  • The "Noise Filter" (Adaptive Noise): Imagine you are listening to a friend talk in a noisy bar. If the bar gets louder, you lean in closer and listen harder. If it gets quiet, you relax. The system does this mathematically. If the laser data is "noisy" (jittery), the system automatically trusts the drone's own movement predictions more. If the data is clean, it trusts the laser more.
  • The "Clump Finder" (Optimized Clustering): The laser sees thousands of dots. The system has to figure out which dots belong to the target drone and which belong to a bird or a tree branch. They used a smart sorting method (DBSCAN) that is tuned to find the "clump" of dots that looks like a drone, even if it's just a few scattered points.
  • The "Memory Lane" (Recovery Mechanism): If the target drone flies behind a building and disappears for a few seconds, the system doesn't give up. It uses a "memory lane" strategy. It keeps predicting where the drone should be. When the drone pops back out, the system gently guides the tracking back on course without getting confused.
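The "clump finder" step can be sketched with off-the-shelf DBSCAN. The paper names DBSCAN, but the synthetic data, `eps`, and `min_samples` values below are illustrative assumptions, not the authors' tuning:

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Fake scan: a tight clump of 40 points (the drone) plus 60 scattered
# clutter points (branches, birds, ground returns).
rng = np.random.default_rng(0)
drone = rng.normal(loc=[5.0, 2.0, 3.0], scale=0.15, size=(40, 3))
clutter = rng.uniform(low=-10, high=10, size=(60, 3))
points = np.vstack([drone, clutter])

# DBSCAN groups points that sit within eps of each other; isolated
# clutter gets the "noise" label -1. Parameter values are assumptions.
labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(points)

# Keep the largest real cluster and use its centroid as the position
# measurement fed into the filter.
candidates = [l for l in set(labels) if l != -1]
best = max(candidates, key=lambda l: int((labels == l).sum()))
centroid = points[labels == best].mean(axis=0)
print(centroid)  # close to the true drone position (5, 2, 3)
```

In practice the cluster choice would also use size and shape gates (a drone-sized bounding box) and the filter's own prediction to reject clutter clumps, but the core "find the drone-like clump of dots" step is just this.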

4. The Results: The Race

They tested this on two real drones flying around. One had the laser scanner, and the other was the target doing aggressive, fast turns.

  • The Old Way (Standard Filter): When the target flew fast or went out of range, the tracker got lost. It thought the drone was 50 meters away when it was actually right next to it. It was like a drunk driver swerving off the road.
  • The "Particle Filter" (Another Method): This method was okay, but it was "jittery." It looked like a shaky camera recording. It could find the drone but couldn't draw a smooth line.
  • The New Way (AEKF): This was the winner. It tracked the drone smoothly, even during sharp turns and when the drone briefly disappeared. It was 49% more accurate than the next best method and didn't crash the drone's computer (it used very little battery and processing power).

The Bottom Line

This paper shows that you don't need expensive, heavy equipment to track drones in the dark or without GPS. By using a "smart" math system that adapts to sparse, noisy, and incomplete data in real time, small drones can now safely fly in swarms, avoid collisions, and find each other even when the world is chaotic and unpredictable.

In short: They taught a small drone to be a better detective by giving it a brain that knows when to trust its eyes and when to trust its instincts.