Localization without Global Navigation Satellite Systems (GNSS) is a critical capability for autonomous operation of unmanned aerial vehicles (UAVs). Vision-based localization on a known map can be an effective solution, but it is burdened by two main problems: the appearance of places varies with weather and season, and the perspective discrepancy between the UAV camera image and the map makes matching hard. In this work, we propose a localization solution that matches UAV camera images to georeferenced orthophotos using a trained convolutional neural network model that is invariant to significant seasonal appearance differences (winter-summer) between the camera image and the map. We compare the convergence speed and localization accuracy of our solution to those of six reference methods. The results show major improvements over the reference methods, especially under high seasonal variation. Finally, we demonstrate that the method successfully localizes a real UAV, showing that it is robust to perspective changes.