Despite significant academic and corporate efforts, autonomous driving under adverse visual conditions remains challenging. As neuromorphic technology has matured, its application to robotics and autonomous vehicle systems has become an area of active research, with particular promise in low-light and latency-critical scenarios. To enable event cameras to operate alongside staple perception sensors such as lidar, we propose a direct, temporally decoupled calibration method between event cameras and lidars. The high dynamic range and low-light operation of event cameras are exploited to directly register lidar laser returns, allowing information-based correlation methods to optimize for the 6-DoF extrinsic calibration between the two sensors. This paper presents the first direct calibration method between event cameras and lidars, removing the dependence on frame-based camera intermediaries and on highly accurate hand measurements. Code will be made publicly available.
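The abstract does not spell out the information-based correlation step; below is a minimal sketch of one common form of it (histogram-based mutual information between lidar reflectance and an event-derived intensity image, optimized over the 6-DoF extrinsics), not the paper's actual implementation. It assumes lidar points with reflectance values, an event stream already accumulated into an intensity-like image, known pinhole intrinsics `K`, and a coarse initial guess `xi0`; all function and variable names are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

def project_points(points_lidar, xi, K):
    """Project 3-D lidar points into the camera under a 6-DoF extrinsic
    guess xi = (rx, ry, rz, tx, ty, tz) (axis-angle + translation)."""
    R = Rotation.from_rotvec(xi[:3]).as_matrix()
    pts_cam = points_lidar @ R.T + xi[3:]
    in_front = pts_cam[:, 2] > 0.1            # keep points in front of the camera
    uvw = pts_cam[in_front] @ K.T
    uv = uvw[:, :2] / uvw[:, 2:3]
    return uv, in_front

def mutual_information(a, b, bins=32):
    """Histogram-based mutual information between two 1-D samples."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))

def neg_mi(xi, points_lidar, reflectance, event_img, K):
    """Negative MI between lidar reflectance and event-image intensity
    sampled at the projected pixel locations (to be minimized)."""
    uv, mask = project_points(points_lidar, xi, K)
    h, w = event_img.shape
    cols = np.round(uv[:, 0]).astype(int)
    rows = np.round(uv[:, 1]).astype(int)
    valid = (cols >= 0) & (cols < w) & (rows >= 0) & (rows < h)
    if valid.sum() < 100:                     # too little overlap to score
        return 0.0
    samples = event_img[rows[valid], cols[valid]]
    return -mutual_information(reflectance[mask][valid], samples)

# Hypothetical usage with a lidar sweep (pts, refl), an accumulated
# event image ev_img, intrinsics K, and an initial guess xi0:
# res = minimize(neg_mi, xi0, args=(pts, refl, ev_img, K),
#                method="Nelder-Mead")
# xi_opt = res.x  # refined 6-DoF extrinsics
```

A derivative-free optimizer such as Nelder-Mead is used here because the histogram-based MI objective is not smooth in the extrinsic parameters.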