We introduce M2DGR: a novel large-scale dataset collected by a ground robot with a full sensor suite, including six fish-eye cameras and one sky-pointing RGB camera, an infrared camera, an event camera, a Visual-Inertial Sensor (VI-sensor), an inertial measurement unit (IMU), a LiDAR, a consumer-grade Global Navigation Satellite System (GNSS) receiver, and a GNSS-IMU navigation system with real-time kinematic (RTK) signals. All these sensors were well calibrated and synchronized, and their data were recorded simultaneously. The ground-truth trajectories were obtained by a motion-capture device, a laser 3D tracker, and an RTK receiver. The dataset comprises 36 sequences (about 1 TB) captured in diverse scenarios, covering both indoor and outdoor environments. We evaluate state-of-the-art SLAM algorithms on M2DGR. The results show that existing solutions perform poorly in some scenarios. For the benefit of the research community, we make the dataset and tools public. The webpage of our project is https://github.com/SJTU-ViSYS/M2DGR.