Multirotor teams are useful for inspection, delivery, and construction tasks, in which they might be required to fly very close to each other. In such close-proximity cases, nonlinear aerodynamic effects can cause catastrophic crashes, necessitating each robot's awareness of its surroundings. Existing approaches rely on expensive or heavy perception sensors. Instead, we propose to use the often-ignored yaw degree of freedom of multirotors to spin a single, cheap, and lightweight monocular camera at a high angular rate for omnidirectional awareness. We provide a dataset collected with real-world physical flights as well as with 3D-rendered scenes, and we compare two existing learning-based methods in different settings with respect to success rate, relative position estimation, and downwash prediction accuracy. As an application, we demonstrate that our proposed spinning camera is capable of predicting the presence of aerodynamic downwash in a challenging swapping task.