Although there have been significant advances in the field of image restoration recently, the system complexity of state-of-the-art (SOTA) methods is also increasing, which may hinder the convenient analysis and comparison of methods. In this paper, we propose a simple baseline that exceeds the SOTA methods and is computationally efficient. To further simplify the baseline, we reveal that nonlinear activation functions, e.g., Sigmoid, ReLU, GELU, and Softmax, are not necessary: they can be replaced by multiplication or removed entirely. Thus, we derive a Nonlinear Activation Free Network, namely NAFNet, from the baseline. SOTA results are achieved on various challenging benchmarks, e.g., 33.69 dB PSNR on GoPro (for image deblurring), exceeding the previous SOTA by 0.38 dB with only 8.4% of its computational cost; 40.30 dB PSNR on SIDD (for image denoising), exceeding the previous SOTA by 0.28 dB with less than half of its computational cost. The code and pre-trained models are released at https://github.com/megvii-research/NAFNet.
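The claim that activation functions "can be replaced by multiplication" refers to a gating operation: the feature map is split in half along the channel dimension and the two halves are multiplied elementwise (the NAFNet paper calls this SimpleGate). The following framework-free sketch illustrates the idea on a 1-D channel vector; an actual implementation would operate on 4-D tensors (N, C, H, W) in a deep-learning framework, and the function name here is illustrative, not from the released code.

```python
def simple_gate(features):
    """Gate a feature vector by splitting its channels in half and
    multiplying the halves elementwise.

    Illustrates the multiplication that replaces nonlinear activations
    (e.g., GELU) in NAFNet. Note the output has half as many channels
    as the input, so surrounding layers must account for that.
    """
    half = len(features) // 2
    first, second = features[:half], features[half:]
    return [a * b for a, b in zip(first, second)]

# A 4-channel feature vector becomes 2 gated channels.
print(simple_gate([1.0, 2.0, 3.0, 4.0]))  # [3.0, 8.0]
```

No fixed nonlinearity is applied; any nonlinearity the network needs emerges from the elementwise product of learned features.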