This paper presents a multimodal indoor odometry dataset, OdomBeyondVision, featuring multiple sensors across different spectra and collected with different mobile platforms. OdomBeyondVision not only contains traditional navigation sensors, such as IMUs, mechanical LiDARs, and RGB-D cameras, but also includes several emerging sensors: single-chip mmWave radar, LWIR thermal camera, and solid-state LiDAR. With these sensors mounted on UAV, UGV, and handheld platforms, we recorded multimodal odometry data and the corresponding movement trajectories in various indoor scenes and under different illumination conditions. We release exemplar radar, radar-inertial, and thermal-inertial odometry implementations to provide baseline results for future work to compare against and improve upon. The full dataset, including toolkit and documentation, is publicly available at: https://github.com/MAPS-Lab/OdomBeyondVision.