Deep neural networks (DNNs) have achieved great success in computer vision. The disparity estimation problem is increasingly addressed by DNNs, which achieve much better prediction accuracy than traditional methods based on hand-crafted features. However, existing DNNs rarely offer both efficient computation and rich expression capability, which makes them difficult to deploy in real-time, high-quality applications, especially on mobile devices. To this end, we propose an efficient, accurate, and configurable deep network for disparity estimation named FADNet++. Leveraging several flexible network design and training techniques, FADNet++ boosts its accuracy while maintaining a fast model inference speed for real-time applications. Moreover, it enables users to easily configure models of different sizes to balance accuracy and inference efficiency. We conduct extensive experiments to demonstrate the effectiveness of FADNet++ on both synthetic and realistic datasets across six GPU devices ranging from server to mobile platforms. Experimental results show that FADNet++ and its variants achieve state-of-the-art prediction accuracy and run an order of magnitude faster than existing 3D models. Under the constraint of running above 15 frames per second (FPS) on a mobile GPU, FADNet++ achieves a new state-of-the-art result on the SceneFlow dataset.