Recent advances in single image super-resolution (SISR) have achieved extraordinary performance, but the computational cost is too heavy to deploy on edge devices. To alleviate this problem, many novel and effective solutions have been proposed. Convolutional neural networks (CNNs) with attention mechanisms have attracted increasing interest due to their efficiency and effectiveness. However, there is still redundancy in the convolution operation. In this paper, we propose the Blueprint Separable Residual Network (BSRN), which contains two efficient designs. One is the use of blueprint separable convolution (BSConv), which replaces the redundant convolution operations. The other is to enhance the model's representational ability by introducing more effective attention modules. The experimental results show that BSRN achieves state-of-the-art performance among existing efficient SR methods. Moreover, BSRN-S, a smaller variant of our model, won first place in the model complexity track of the NTIRE 2022 Efficient SR Challenge. The code is available at https://github.com/xiaom233/BSRN.
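For readers unfamiliar with BSConv, the PyTorch snippet below is a minimal sketch of the idea: a 1x1 pointwise convolution followed by a depthwise convolution (in the style of the BSConv-U variant). The class and argument names are illustrative only and are not taken from the released BSRN code.

```python
import torch
import torch.nn as nn


class BSConvU(nn.Module):
    """Illustrative blueprint separable convolution (BSConv-U style):
    a 1x1 pointwise convolution followed by a depthwise convolution."""

    def __init__(self, in_channels, out_channels, kernel_size=3, padding=1):
        super().__init__()
        # Pointwise step: mixes information across channels with 1x1 kernels.
        self.pointwise = nn.Conv2d(in_channels, out_channels,
                                   kernel_size=1, bias=False)
        # Depthwise step: one spatial "blueprint" filter per output channel.
        self.depthwise = nn.Conv2d(out_channels, out_channels, kernel_size,
                                   padding=padding, groups=out_channels,
                                   bias=True)

    def forward(self, x):
        return self.depthwise(self.pointwise(x))


if __name__ == "__main__":
    x = torch.randn(1, 64, 32, 32)  # dummy feature map
    conv = BSConvU(64, 64)
    print(conv(x).shape)            # torch.Size([1, 64, 32, 32])
```

Compared with a standard k x k convolution, this factorization uses far fewer parameters and multiply-adds, which is the source of the efficiency gain the abstract refers to.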