Recent research has enabled fixed-wing unmanned aerial vehicles (UAVs) to maneuver in constrained spaces through the use of direct nonlinear model predictive control (NMPC). However, this approach has been limited to a priori known maps and ground-truth state measurements. In this paper, we present a direct NMPC approach that leverages NanoMap, a lightweight point-cloud mapping framework, to generate collision-free trajectories using onboard stereo vision. We first explore our approach in simulation and demonstrate that our algorithm is sufficient to enable vision-based navigation in urban environments. We then demonstrate our approach in hardware using a 42-inch fixed-wing UAV and show that our motion planning algorithm is capable of navigating around a building using a minimal set of goal points. We also show that storing a point-cloud history is important for navigating these types of constrained environments.