Snow removal aims to locate snow-covered areas and recover clean images without leaving repair traces. Unlike rain, which is regular and semi-transparent, snow exhibits diverse patterns and degradations that seriously occlude the background. As a result, state-of-the-art snow removal methods usually retain large parameter sizes. In this paper, we propose a lightweight but highly efficient snow removal network called the Laplace Mask Query Transformer (LMQFormer). First, we present a Laplace-VQVAE that generates a coarse mask as prior knowledge of snow. Instead of using the masks in the dataset, we aim to reduce both the information entropy of snow and the computational cost of recovery. Second, we design a Mask Query Transformer (MQFormer) that removes snow with the coarse mask, using two parallel encoders and a hybrid decoder to learn extensive snow features under lightweight constraints. Third, we develop a Duplicated Mask Query Attention (DMQA) that converts the coarse mask into a fixed number of queries, which constrain the attention areas of MQFormer with reduced parameters. Experimental results on popular datasets demonstrate the efficiency of the proposed model, which achieves state-of-the-art snow removal quality with significantly fewer parameters and the lowest running time.
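As a rough illustration of the mask-to-query idea described above, the following is a minimal NumPy sketch of converting a coarse snow mask into a fixed number of duplicated query vectors that then cross-attend over image features. All function names, the random projection standing in for a learned layer, and the tensor shapes are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def mask_to_queries(mask, num_queries, dim, seed=0):
    """Flatten a coarse mask, project it to a query vector, and
    duplicate it into a fixed number of queries.
    The random projection is a stand-in for a learned linear layer."""
    flat = mask.reshape(-1)                       # (H*W,)
    rng = np.random.default_rng(seed)
    proj = rng.standard_normal((flat.size, dim)) / np.sqrt(flat.size)
    q = flat @ proj                               # (dim,)
    return np.tile(q, (num_queries, 1))           # (num_queries, dim)

def cross_attention(queries, features):
    """Scaled dot-product cross-attention: mask-derived queries
    attend over flattened image features.
    queries: (Nq, d), features: (H*W, d) -> (Nq, d)."""
    scores = queries @ features.T / np.sqrt(queries.shape[1])
    # numerically stable softmax over the feature axis
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ features

# Example: an 8x8 coarse mask with a snow region, 4 queries of dim 16
mask = np.zeros((8, 8))
mask[2:5, 2:5] = 1.0
queries = mask_to_queries(mask, num_queries=4, dim=16)
feats = np.random.default_rng(1).standard_normal((64, 16))
attended = cross_attention(queries, feats)       # shape (4, 16)
```

Because all queries are duplicates of one mask-derived vector, the attention cost scales with the small, fixed query count rather than with the spatial resolution, which is consistent with the parameter-reduction motivation stated in the abstract.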