The development of aerial autonomy has enabled aerial robots to fly with agility in complex environments. However, dodging fast-moving objects in flight remains a challenge, limiting the further application of unmanned aerial vehicles (UAVs). The bottleneck in solving this problem is the accurate perception of fast dynamic objects. Recently, event cameras have shown great potential for this task. This paper presents a complete perception system, including ego-motion compensation, object detection, and trajectory prediction, that handles fast-moving dynamic objects with low latency and high precision. First, we propose an accurate ego-motion compensation algorithm that considers both rotational and translational motion to enable more robust object detection. Then, an efficient event-camera-based regression algorithm is designed for dynamic object detection. Finally, we propose an optimization-based approach that asynchronously fuses event and depth cameras for trajectory prediction. Extensive real-world experiments and benchmarks are performed to validate our framework. Moreover, our code will be released to benefit related research.
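To make the ego-motion compensation step more concrete, the sketch below warps a batch of events to a common reference time by undoing both the rotational and the translational camera motion, so that background events align and the remaining misaligned events correspond to moving objects. This is not the authors' implementation: the function name `warp_events`, the constant-velocity assumption over the event window, the sign conventions, and the availability of a per-event depth estimate are all assumptions made for illustration.

```python
# Minimal sketch of event ego-motion compensation, assuming a constant angular
# velocity `omega` (rad/s) and linear velocity `vel` (m/s) over the event
# window, a pinhole intrinsic matrix `K`, and a per-event depth estimate.
# All names here are illustrative, not the paper's API.
import numpy as np
from scipy.spatial.transform import Rotation


def warp_events(xy, t, depth, omega, vel, K, t_ref):
    """Warp event pixels (N, 2) at timestamps t (N,) to the reference time t_ref.

    Each event is back-projected with its depth, moved by the ego-motion
    accumulated between its timestamp and t_ref, and re-projected, so that
    events generated by the static background align in the image plane.
    """
    dt = (t_ref - t)[:, None]                      # (N, 1) time to reference
    K_inv = np.linalg.inv(K)

    # Back-project pixels to 3D camera points using per-event depth.
    ones = np.ones((xy.shape[0], 1))
    rays = (K_inv @ np.hstack([xy, ones]).T).T     # (N, 3) normalized rays
    P = rays * depth[:, None]                      # (N, 3) 3D points

    # Rotational compensation: rotate each point by R(omega * dt).
    rotvecs = omega[None, :] * dt                  # (N, 3) small-angle rotations
    P_rot = Rotation.from_rotvec(rotvecs).apply(P)

    # Translational compensation: shift by the camera displacement v * dt.
    P_comp = P_rot - vel[None, :] * dt

    # Re-project to the image plane at the reference time.
    uvw = (K @ P_comp.T).T
    return uvw[:, :2] / uvw[:, 2:3]
```

After this warping, a simple per-pixel timestamp or count statistic on the compensated event image is typically enough to separate dynamic-object events from residual background events, which is the input the detection stage operates on.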