Self-similarity refers to the image prior, widely used in image restoration algorithms, that small but similar patterns tend to recur at different locations and scales. However, recent deep convolutional neural network based methods for image restoration do not take full advantage of self-similarity, because they rely on self-attention modules that only process information at a single scale. To address this problem, we present a novel pyramid attention module for image restoration, which captures long-range feature correspondences from a multi-scale feature pyramid. Inspired by the observation that corruptions such as noise or compression artifacts drop drastically at coarser image scales, our attention module is designed to borrow "clean" signals from correspondences at coarser pyramid levels. The proposed pyramid attention module is a generic building block that can be flexibly integrated into various neural architectures. Its effectiveness is validated through extensive experiments on multiple image restoration tasks: image denoising, demosaicing, compression artifact reduction, and super-resolution. Without any bells and whistles, our PANet (pyramid attention module with simple network backbones) produces state-of-the-art results with superior accuracy and visual quality. Our code will be available at https://github.com/SHI-Labs/Pyramid-Attention-Networks
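To illustrate the core idea of attending from full-resolution queries to keys and values drawn from a multi-scale feature pyramid, the following is a minimal PyTorch sketch. It is not the authors' implementation (see the repository above for that): the class name, the choice of scale factors, the 1x1 embedding convolutions, and the use of bicubic downsampling here are illustrative assumptions, and the authors' actual module additionally performs patch-based matching.

```python
# Minimal sketch of a multi-scale (pyramid) non-local attention block.
# Assumption: names, scales, and embedding layers below are illustrative,
# not the authors' exact design.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PyramidAttentionSketch(nn.Module):
    """Full-resolution queries attend to keys/values taken from several
    downsampled copies of the same feature map, so matches can be
    "borrowed" from coarser (less corrupted) pyramid levels."""

    def __init__(self, channels, scales=(1.0, 0.75, 0.5), embed_channels=32):
        super().__init__()
        self.scales = scales
        self.query_embed = nn.Conv2d(channels, embed_channels, kernel_size=1)
        self.key_embed = nn.Conv2d(channels, embed_channels, kernel_size=1)

    def forward(self, x):
        b, c, h, w = x.shape
        # Queries from the full-resolution feature map: (B, H*W, E)
        q = self.query_embed(x).flatten(2).transpose(1, 2)

        keys, values = [], []
        for s in self.scales:
            xs = x if s == 1.0 else F.interpolate(
                x, scale_factor=s, mode="bicubic", align_corners=False)
            keys.append(self.key_embed(xs).flatten(2))   # (B, E, h_s*w_s)
            values.append(xs.flatten(2))                 # (B, C, h_s*w_s)
        k = torch.cat(keys, dim=2)                       # keys from all scales
        v = torch.cat(values, dim=2).transpose(1, 2)     # values from all scales

        # Scaled dot-product attention over positions from every pyramid level.
        attn = torch.softmax(q @ k / k.shape[1] ** 0.5, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, c, h, w)
        return x + out                                   # residual connection


if __name__ == "__main__":
    block = PyramidAttentionSketch(channels=64)
    feat = torch.randn(1, 64, 48, 48)
    print(block(feat).shape)  # torch.Size([1, 64, 48, 48])
```

Because the block keeps the input and output shapes identical and ends with a residual connection, it can be dropped between existing layers of a restoration backbone, which is the "generic building block" property the abstract refers to.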