Underwater images are altered by the physical characteristics of the medium through which light rays pass before reaching the optical sensor. Scattering and wavelength-dependent absorption significantly modify the captured colors depending on the distance of observed elements to the image plane. In this paper, we aim to recover an image of the scene as if the water had no effect on light propagation. We introduce SUCRe, a new method that exploits the scene's 3D structure for underwater color restoration. By following points across multiple images and tracking their intensities at different distances to the sensor, we constrain the optimization of the parameters of an underwater image formation model and retrieve unattenuated pixel intensities. We conduct extensive quantitative and qualitative analyses of our approach in a variety of settings, ranging from natural light to deep-sea environments, using three underwater datasets acquired in real-world conditions and one synthetic dataset. We also compare the performance of the proposed approach with that of a wide range of existing state-of-the-art methods. The results demonstrate a consistent benefit of exploiting multiple views across a spectrum of objective metrics. Our code is publicly available at https://github.com/clementinboittiaux/sucre.
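To make the multi-view idea concrete, below is a minimal sketch, not the paper's implementation, of how one color channel of a single tracked scene point could be restored by fitting a standard underwater image formation model (attenuation plus backscatter) to its intensities observed at several distances. The distances, intensities, parameter names, and initial guess are illustrative assumptions; in a full pipeline the water parameters would be shared across all points and images, and only the restored intensity would vary per point.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical observations of one scene point in one color channel:
# z_i is the camera-to-point distance in image i, I_i the measured intensity.
# Values are illustrative only.
z = np.array([1.2, 2.5, 4.0, 6.3, 8.1])      # distances (meters)
I = np.array([0.42, 0.33, 0.27, 0.24, 0.22])  # observed intensities in [0, 1]

def residuals(params, z, I):
    # J: unattenuated intensity of the point, beta: attenuation coefficient,
    # B: backscatter (veiling light) value at infinity, gamma: backscatter coefficient.
    J, beta, B, gamma = params
    model = J * np.exp(-beta * z) + B * (1.0 - np.exp(-gamma * z))
    return model - I

# Fit the parameters by non-linear least squares over the multi-view observations.
x0 = np.array([0.5, 0.1, 0.2, 0.1])           # rough initial guess (assumed)
fit = least_squares(residuals, x0, args=(z, I), bounds=(0.0, np.inf))
J_restored, beta, B, gamma = fit.x
print(f"restored intensity J = {J_restored:.3f}")
```

Observing the same point at several distances is what disambiguates the unattenuated intensity from the water parameters: a single observation admits infinitely many (J, beta, B, gamma) explanations, whereas the multi-view track over-constrains the fit.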