Underwater images are altered by the physical characteristics of the medium through which light rays pass before reaching the optical sensor. Scattering and strong wavelength-dependent absorption significantly modify the captured colors depending on the distance of the observed elements from the image plane. In this paper, we aim to recover the original colors of the scene as if the water had no effect on them. We propose two novel methods that rely on different sets of inputs. The first assumes that pixel intensities in the restored image are normally distributed within each color channel, leading to an alternative optimization of the well-known \textit{Sea-thru} method, which acts on single images and their distance maps. We additionally introduce SUCRe, a new method that further exploits the scene's 3D Structure for Underwater Color Restoration. By following points across multiple images and tracking their intensities at different distances to the sensor, we constrain the optimization of the image formation model parameters. When compared to similar existing approaches, SUCRe provides clear improvements in a variety of scenarios ranging from natural light to deep-sea environments. The code for both approaches is publicly available at https://github.com/clementinboittiaux/sucre.
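For context, both methods build on the underwater image formation model popularized by \textit{Sea-thru}; the sketch below follows the standard formulation from that literature, and the exact parameterization used in this paper may differ.
\begin{equation*}
	I_c = J_c \, e^{-\beta_c^D z} + B_c^\infty \left(1 - e^{-\beta_c^B z}\right), \quad c \in \{R, G, B\},
\end{equation*}
where $I_c$ denotes the observed intensity, $J_c$ the restored (water-free) intensity, $z$ the distance of the point to the sensor, $B_c^\infty$ the backscatter at infinity, and $\beta_c^D$, $\beta_c^B$ the wavelength-dependent attenuation and backscatter coefficients estimated during optimization.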