Supervised learning with deep convolutional neural networks (DCNNs) has been widely adopted in stereo matching. However, acquiring large-scale datasets with well-labeled ground truth is cumbersome and labor-intensive, which often makes supervised learning-based approaches hard to implement in practice. To overcome this drawback, we propose a robust and effective self-supervised stereo matching approach, consisting of a pyramid voting module (PVM) and a novel DCNN architecture, referred to as OptStereo. Specifically, our OptStereo first builds multi-scale cost volumes and then adopts a recurrent unit to iteratively update disparity estimations at high resolution, while our PVM generates reliable semi-dense disparity images that can be employed to supervise OptStereo training. Furthermore, we publish the HKUST-Drive dataset, a large-scale synthetic stereo dataset collected under different illumination and weather conditions for research purposes. Extensive experimental results demonstrate the effectiveness and efficiency of our self-supervised stereo matching approach on the KITTI Stereo benchmarks and our HKUST-Drive dataset. PVStereo, our best-performing implementation, greatly outperforms all other state-of-the-art self-supervised stereo matching approaches. Our project page is available at sites.google.com/view/pvstereo.
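To illustrate the pyramid voting idea described above (cross-checking disparity estimates obtained at several pyramid resolutions and keeping only the pixels on which they agree as semi-dense pseudo-labels), the following is a minimal sketch. It is not the paper's exact PVM: the function name `pyramid_vote`, the median-based agreement test, and the `agree_thresh`/`min_votes` parameters are assumptions introduced here for clarity, and the per-level disparity maps are assumed to come from any off-the-shelf matcher.

```python
import numpy as np

def pyramid_vote(disp_pyramid, scales, agree_thresh=1.0, min_votes=2):
    """Fuse disparity maps from several pyramid levels into one
    semi-dense map by keeping only pixels on which the levels agree.

    disp_pyramid[i] is an (H // scales[i], W // scales[i]) disparity map
    estimated at that resolution; scales[i] is its integer downsampling
    factor (1 = full resolution, H and W assumed divisible by it).
    """
    # Bring every level to full resolution: upsample spatially and
    # rescale the disparity values, which shrink with image width.
    full = []
    for d, s in zip(disp_pyramid, scales):
        up = np.repeat(np.repeat(d, s, axis=0), s, axis=1) * s
        full.append(up)
    stack = np.stack(full)                      # (levels, H, W)

    # Vote: a pixel is reliable if enough levels fall within
    # agree_thresh of the per-pixel median estimate.
    median = np.median(stack, axis=0)
    votes = np.sum(np.abs(stack - median) <= agree_thresh, axis=0)

    # Keep reliable pixels; NaN marks pixels left unsupervised.
    return np.where(votes >= min_votes, median, np.nan)
```

The resulting semi-dense map can then serve as a pseudo ground-truth signal, with the loss evaluated only on non-NaN pixels, which is how the abstract describes supervising OptStereo without manually labeled disparity.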