The integration of multiple cameras and 3D LiDARs has become a basic configuration of augmented reality devices, robots, and autonomous vehicles. Calibration of these multi-modal sensors is crucial for such systems to function properly, yet it remains tedious and impractical for mass production. Moreover, most devices require re-calibration after a certain period of use. In this paper, we propose a single-shot solution for calibrating the extrinsic transformations among multiple cameras and 3D LiDARs. We establish a panoramic infrastructure in which a camera or LiDAR can be robustly localized using data from a single frame. Experiments conducted on three devices with different camera-LiDAR configurations show that our approach achieves calibration accuracy comparable to state-of-the-art approaches, but with much greater efficiency.
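As a brief sketch of the underlying idea (the notation below is illustrative and not taken from the paper): once each sensor is independently localized against the panoramic infrastructure from a single frame, the pairwise camera-LiDAR extrinsic follows directly by pose composition,
\[
T_{CL} \;=\; \left(T_{WC}\right)^{-1} T_{WL},
\qquad
\mathbf{p}_{C} \;=\; T_{CL}\,\mathbf{p}_{L},
\]
where $T_{WC}$ and $T_{WL}$ denote the camera and LiDAR poses estimated in the infrastructure frame $W$, and $\mathbf{p}_{L}$, $\mathbf{p}_{C}$ are homogeneous point coordinates in the LiDAR and camera frames, respectively.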