Accurate self and relative state estimation are critical preconditions for completing swarm tasks, e.g., collaborative autonomous exploration, target tracking, and search and rescue. This paper proposes a fully decentralized state estimation method for aerial swarm systems, in which each drone performs precise ego-state estimation, exchanges ego-state and mutual observation information via wireless communication, and estimates its relative state with respect to (w.r.t.) the rest of the UAVs, all in real time and based only on LiDAR-inertial measurements. A novel 3D LiDAR-based drone detection, identification, and tracking method is proposed to obtain observations of teammate drones. The mutual observation measurements are then tightly coupled with IMU and LiDAR measurements to jointly estimate the ego-state and relative state in real time and with high accuracy. Extensive real-world experiments demonstrate broad adaptability to complicated scenarios, including GPS-denied scenes and scenes that are degenerate for cameras (dark night) or for LiDAR (facing a single wall). Compared with the ground truth provided by a motion capture system, the results show centimeter-level localization accuracy, outperforming other state-of-the-art LiDAR-inertial odometry methods for single-UAV systems.
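The decentralized exchange described above can be sketched as follows. This is a minimal illustration under assumed names (`SwarmMessage`, `estimate_relative_states` are hypothetical, not from the paper), where simple averaging of the available relative-position candidates stands in for the paper's tightly-coupled estimator, and states are reduced to positions for brevity:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

Vec3 = Tuple[float, float, float]

# Hypothetical message each drone broadcasts over the wireless link:
# its ego-state estimate plus its LiDAR-based observations of teammates.
@dataclass
class SwarmMessage:
    sender_id: int
    ego_position: Vec3                         # ego-state (position only, for brevity)
    mutual_observations: Dict[int, Vec3] = field(default_factory=dict)  # teammate_id -> relative position seen by sender

def estimate_relative_states(own_id: int,
                             own_position: Vec3,
                             own_observations: Dict[int, Vec3],
                             inbox: List[SwarmMessage]) -> Dict[int, Vec3]:
    """Estimate the relative position of each teammate w.r.t. this drone by
    averaging three candidate sources (a stand-in for the paper's
    tightly-coupled fusion): (1) the difference of broadcast ego-states,
    (2) this drone's direct LiDAR observation of the teammate, and
    (3) the sign-flipped observation of this drone made by the teammate."""
    estimates: Dict[int, Vec3] = {}
    for msg in inbox:
        candidates: List[Vec3] = []
        # (1) Teammate ego-state minus own ego-state.
        candidates.append(tuple(t - o for t, o in zip(msg.ego_position, own_position)))
        # (2) Our own LiDAR detection of that teammate, if available.
        if msg.sender_id in own_observations:
            candidates.append(own_observations[msg.sender_id])
        # (3) The teammate's detection of us, negated, if available.
        if own_id in msg.mutual_observations:
            candidates.append(tuple(-v for v in msg.mutual_observations[own_id]))
        n = len(candidates)
        estimates[msg.sender_id] = tuple(sum(c[i] for c in candidates) / n for i in range(3))
    return estimates
```

In the actual system each candidate would enter a joint estimator weighted by its uncertainty rather than a plain mean; the sketch only shows how ego-state broadcasts and mutual observations complement each other.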