On Onboard LiDAR-based Flying Object Detection

Matouš Vrba, Viktor Walter, Václav Pritzl, Michal Pliska, Tomáš Báča, Vojtěch Spurný, Daniel Heřt, and Martin Saska - CTU in Prague

Abstract

A new robust and accurate approach for the detection and localization of flying objects, intended for highly dynamic aerial interception and agile multi-robot interaction, is presented in this paper. The approach is proposed for use on board an autonomous aerial vehicle equipped with a 3D LiDAR sensor. It relies on a novel 3D occupancy voxel mapping method for target detection and a cluster-based multiple hypothesis tracker to compensate for the uncertainty of the sensory data. Compared to state-of-the-art methods for onboard detection of other flying objects, the presented approach provides superior localization accuracy, robustness to different environments and appearance changes of the target, and a greater detection range. Furthermore, in combination with the proposed multi-target tracker, sporadic false positives are suppressed, state estimation of the target is provided, and the detection latency is negligible. This makes the detector suitable for tasks of agile multi-robot interaction, such as autonomous aerial interception or formation control, where precise, robust, and fast relative localization of other robots is crucial. We demonstrate the practical usability and performance of the system in simulated and real-world experiments.


Source code

UAV point cloud segmentation dataset

  • The labeled dataset is available here.
  • The dataset consists of 5457 full scans from Ouster OS0-128 and OS1-128 LiDARs, split into a training set (4242 scans) and a testing set (1213 scans). The scans are saved as NumPy arrays serialized with Python's pickle module, where each row of the array represents a single point with four elements: x, y, z, label. Background points have label = 0; points corresponding to the UAV have label = 1.
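A minimal sketch of how a scan in this layout can be loaded and split by label. The filename and the synthetic array below are placeholders for illustration; real dataset files follow the same N×4 (x, y, z, label) pickle format described above.

```python
import pickle
import numpy as np

# Write a tiny synthetic scan in the documented layout so the sketch
# is self-contained; real scans from the dataset are loaded the same way.
scan = np.array([
    [12.1,  0.4, 1.8, 0.0],   # background point (label = 0)
    [35.6, -2.0, 9.5, 1.0],   # UAV point (label = 1)
    [36.0, -2.1, 9.6, 1.0],   # UAV point (label = 1)
])
with open("example_scan.pkl", "wb") as f:  # placeholder filename
    pickle.dump(scan, f)

# Load a scan: an N x 4 NumPy array, one point per row (x, y, z, label).
with open("example_scan.pkl", "rb") as f:
    scan = pickle.load(f)

xyz = scan[:, :3]                # point coordinates
labels = scan[:, 3].astype(int)  # 0 = background, 1 = UAV

uav_points = xyz[labels == 1]
background_points = xyz[labels == 0]
print(f"{len(uav_points)} UAV points, {len(background_points)} background points")
```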
Dataset properties

| Split    | Name                  | Environment | Target   | Sensor  | Target range    | Number of scans |
|----------|-----------------------|-------------|----------|---------|-----------------|-----------------|
| training | cisar_2021            | semi-urban  | MRS F450 | OS0-128 | close to far    | 1187            |
| training | cisar_2023            | semi-urban  | MRS X500 | OS1-128 | medium to far   | 2442            |
| training | temesvar_2024_exp252  | open field  | MRS T650 | OS1-128 | close to medium | 613             |
| testing  | temesvar_2024_exp259  | open field  | MRS T650 | OS1-128 | close to medium | 1213            |

Cite as

Matouš Vrba, Viktor Walter, Václav Pritzl, Michal Pliska, Tomáš Báča, Vojtěch Spurný, Daniel Heřt, and Martin Saska, "On Onboard LiDAR-based Flying Object Detection," IEEE Transactions on Robotics, vol. 41, pp. 593–611, 2025.

@article{vrba2025OnboardLiDARBasedFlying,
  title = {On Onboard LiDAR-Based Flying Object Detection},
  author = {Vrba, Matouš and Walter, Viktor and Pritzl, Václav and Pliska, Michal and Báča, Tomáš and Spurný, Vojtěch and Heřt, Daniel and Saska, Martin},
  year = {2025},
  journal = {IEEE Transactions on Robotics},
  volume = {41},
  pages = {593--611},
  issn = {1941-0468},
  doi = {10.1109/TRO.2024.3502494}
}

Video summary of the paper


Videos from experiments

Simulated multi-UAV experiment


Real-world deployments

Eagle.One autonomous aerial interception system

  • The detection, tracking, and estimation system is used to implement the Eagle.One commercial AAIS platform.
  • The state estimate of the tracked target provided by the system is used in feedback with an interception trajectory planner to autonomously, safely, and reliably catch the target in a net carried by the interceptor.
  • More details regarding the state estimation, planning, and extensive experimental evaluation are presented in
    • Michal Pliska, Matouš Vrba, Tomáš Báča, and Martin Saska, "Towards Safe Mid-Air Drone Interception: Strategies for Tracking & Capture," arXiv preprint arXiv:2405.13542, 2024, submitted to RA-L (after the first round of revisions). DOI: 10.48550/arXiv.2405.13542
  • Video from a preliminary experiment of an early prototype of the Eagle.One AAIS testing the detector and tracker:

  • Summary video of tests of interception trajectory planning algorithms using the detector and tracker in feedback:

  • Promotional video of the current state of the Eagle.One drone hunter:

Cooperative navigation in cluttered environments: Drones guiding drones

  • Cooperative navigation of a secondary, less-equipped micro aerial vehicle by a primary UAV equipped with a LiDAR and running the detection system is presented in
    • Václav Pritzl, Matouš Vrba, Yurii Stasinchuk, Vı́t Krátký, Jiřı́ Horyna, Petr Štěpán, and Martin Saska, "Drones Guiding Drones: Cooperative Navigation of a Less-Equipped Micro Aerial Vehicle in Cluttered Environments", accepted to IROS 2024 (more info here).
  • The relative position of the target, detected by the system running onboard the primary UAV, is used to correct the drift of the secondary UAV's self-localization and to plan collision-free paths for both team members within the environment mapped by the primary UAV's LiDAR.
  • Video demonstrating the cooperative navigation in an outdoor exploration scenario: