We present a novel real-time visual odometry framework for a stereo setup combining a depth camera and a high-resolution event camera. Our framework balances accuracy and robustness against computational efficiency to achieve strong performance in challenging scenarios. We extend conventional edge-based semi-dense visual odometry to time-surface maps obtained from event streams. Semi-dense depth maps are generated by warping the corresponding depth values from the extrinsically calibrated depth camera. The tracking module updates the camera pose through efficient, geometric semi-dense 3D-2D edge alignment. Our approach is validated on both public and self-collected datasets captured under various conditions. We show that the proposed method performs comparably to state-of-the-art RGB-D camera-based alternatives in regular conditions, and outperforms them in challenging conditions such as high dynamics or low illumination.
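To make the time-surface representation referenced above concrete, the following minimal Python sketch shows one common way such maps are built from an event stream: each pixel stores an exponential decay of the time elapsed since its most recent event, producing an edge-like grayscale image. The event layout `(x, y, t)` and the decay constant `tau` are illustrative assumptions, not the paper's actual implementation, which may also account for event polarity.

```python
import numpy as np

def time_surface(events, t_ref, height, width, tau=0.03):
    """Build a time-surface map from an event stream (illustrative sketch).

    events: iterable of (x, y, t) tuples with pixel coordinates and
            timestamps in seconds (assumed layout, not the paper's API).
    t_ref:  reference time at which the surface is evaluated.
    tau:    decay constant in seconds (hypothetical default).
    """
    # Track the most recent event timestamp observed at each pixel.
    last_ts = np.full((height, width), -np.inf)
    for x, y, t in events:
        if t <= t_ref:
            last_ts[y, x] = max(last_ts[y, x], t)
    # Exponential decay: recently active pixels map near 1,
    # pixels with no (or stale) events decay toward 0.
    surface = np.exp(-(t_ref - last_ts) / tau)
    # Pixels that never fired have last_ts == -inf; force them to 0.
    surface[~np.isfinite(last_ts)] = 0.0
    return surface
```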