We present a novel approach to joint depth and normal estimation for time-of-flight (ToF) sensors. Our model learns to jointly predict high-quality depth and normal maps from raw ToF sensor data. To achieve this, we constructed the first large-scale dataset of its kind (named ToF-100), pairing raw ToF data with ground-truth high-resolution depth maps captured by an industrial depth camera. We also design a simple but effective framework for joint depth and normal estimation, applying a robust Chamfer loss via jittering to improve performance. Our experiments demonstrate that the proposed method efficiently reconstructs high-resolution depth and normal maps and significantly outperforms state-of-the-art approaches. Our code and data will be available at \url{https://github.com/hkustVisionRr/JointlyDepthNormalEstimation}
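As an illustration of the loss mentioned above, the following is a minimal NumPy sketch of a Chamfer distance made more robust by jittering: the ground-truth point set is perturbed with Gaussian noise several times and the loss is averaged over the perturbed copies. The function names, the noise model, and the averaging scheme are assumptions for exposition, not the paper's actual implementation.

```python
import numpy as np

def chamfer_distance(p, q):
    """Symmetric Chamfer distance between point sets p (N, 3) and q (M, 3)."""
    # Pairwise Euclidean distances, shape (N, M).
    d = np.linalg.norm(p[:, None, :] - q[None, :, :], axis=-1)
    # Nearest-neighbor term in each direction, averaged.
    return d.min(axis=1).mean() + d.min(axis=0).mean()

def jittered_chamfer_loss(pred, gt, sigma=0.01, n_jitter=4, seed=None):
    """Average Chamfer distance over several Gaussian-jittered copies of the
    ground truth -- a simple robustification against small misalignments.
    (sigma, n_jitter are illustrative hyperparameters.)"""
    rng = np.random.default_rng(seed)
    losses = [
        chamfer_distance(pred, gt + rng.normal(0.0, sigma, gt.shape))
        for _ in range(n_jitter)
    ]
    return float(np.mean(losses))
```

On identical point sets the plain Chamfer distance is zero, while the jittered variant returns a small positive value on the order of `sigma`, which is what makes it tolerant to minor ground-truth noise.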