With the growing demand for deploying teams of robots to perform tasks collaboratively, the research community has become increasingly interested in collaborative simultaneous localization and mapping (SLAM). Unfortunately, existing datasets are limited in the scale and variation of their collaborative trajectories, even though generalization across the trajectories of different agents is crucial to the overall viability of collaborative tasks. To help align the research community's contributions with realistic multi-agent coordinated SLAM problems, we propose S3E, a large-scale multimodal dataset captured by a fleet of unmanned ground vehicles following four designed collaborative trajectory paradigms. S3E consists of 7 outdoor and 5 indoor sequences, each exceeding 200 seconds, with well temporally synchronized and spatially calibrated high-frequency IMU, high-quality stereo camera, and 360-degree LiDAR data. Crucially, our effort exceeds previous attempts in dataset size, scene variability, and complexity; the average recording time is 4x that of the pioneering EuRoC dataset. We also provide careful dataset analysis as well as baselines for collaborative SLAM and its single-agent counterparts. Data and up-to-date details are available at https://github.com/PengYu-Team/S3E.