Self-supervised monocular depth prediction provides a cost-effective way to obtain the 3D location of each pixel. However, existing approaches usually yield unsatisfactory accuracy, which is critical for autonomous robots. In this paper, we propose a novel two-stage network that advances self-supervised monocular dense depth learning by leveraging low-cost sparse (e.g., 4-beam) LiDAR. Unlike existing methods that use sparse LiDAR mainly through time-consuming iterative post-processing, our model fuses monocular image features and sparse LiDAR features to predict initial depth maps. An efficient feed-forward refinement network is then designed to correct the errors in these initial depth maps in pseudo-3D space with real-time performance. Extensive experiments show that our proposed model significantly outperforms all state-of-the-art self-supervised methods, as well as sparse-LiDAR-based methods, on both self-supervised monocular depth prediction and depth completion tasks. With this accurate dense depth prediction, our model outperforms the state-of-the-art sparse-LiDAR-based method (Pseudo-LiDAR++) by more than 68% on the downstream task of monocular 3D object detection on the KITTI leaderboard.