In this paper, we present a Computer Vision (CV) based tracking and fusion algorithm dedicated to a 3D-printed gimbal system on drones operating in nature. The gimbal system robustly stabilizes the camera orientation in challenging natural scenarios by using the skyline and the ground plane as references. Our main contributions are the following: a) a lightweight ResNet-18 backbone network, trained from scratch and deployed on the Jetson Nano platform, segments the image into two binary parts (ground and sky); b) a geometric assumption derived from natural cues enables robust visual tracking with the skyline and ground plane as references; c) a spherical-surface-based adaptive particle sampling scheme flexibly fuses orientation estimates from multiple sensor sources. The complete algorithm pipeline is tested on our customized gimbal module, which includes the Jetson Nano and other hardware components. The experiments were performed on top of a building overlooking a real landscape.
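As a rough illustration of contribution (a), the sketch below shows one plausible way to build a lightweight binary sky/ground segmentation network around a ResNet-18 backbone in PyTorch. The class name `SkyGroundSegNet`, the single-channel logit head, the input resolution, and the sky/ground label convention are assumptions for illustration only, not the authors' exact architecture.

```python
# Minimal sketch, assuming a ResNet-18 encoder with a 1x1 convolutional head
# for per-pixel binary sky/ground segmentation (hypothetical architecture).
import torch
import torch.nn as nn
import torchvision


class SkyGroundSegNet(nn.Module):
    def __init__(self):
        super().__init__()
        # ResNet-18 trained from scratch (no pretrained weights), as in the paper.
        backbone = torchvision.models.resnet18(weights=None)
        # Keep the convolutional stages; drop the average pool and classifier.
        self.encoder = nn.Sequential(*list(backbone.children())[:-2])  # (B, 512, H/32, W/32)
        # Single-channel logit map; sigmoid > 0.5 is taken as "sky" (assumed convention).
        self.head = nn.Conv2d(512, 1, kernel_size=1)

    def forward(self, x):
        feat = self.encoder(x)
        logits = self.head(feat)
        # Upsample the coarse logits back to the input resolution.
        return nn.functional.interpolate(
            logits, size=x.shape[-2:], mode="bilinear", align_corners=False
        )


if __name__ == "__main__":
    model = SkyGroundSegNet().eval()
    frame = torch.randn(1, 3, 224, 384)        # example camera frame (assumed size)
    mask = model(frame).sigmoid() > 0.5        # binary sky/ground mask
    print(mask.shape)                          # torch.Size([1, 1, 224, 384])
```

In practice such a model would be exported (e.g., via ONNX/TensorRT) for efficient inference on the Jetson Nano, and the resulting mask boundary would provide the skyline and ground-plane cues used for orientation tracking.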