Recent advances in single image super-resolution (SISR) have explored the power of convolutional neural networks (CNNs) to achieve better performance. Despite the great success of CNN-based methods, they are not easy to deploy on edge devices because of their heavy computational requirements. To solve this problem, various fast and lightweight CNN models have been proposed. The information distillation network is one of the state-of-the-art methods; it adopts a channel splitting operation to extract distilled features. However, it is not clear how this operation helps in the design of efficient SISR models. In this paper, we propose the feature distillation connection (FDC), which is functionally equivalent to the channel splitting operation while being more lightweight and flexible. Thanks to FDC, we can rethink the information multi-distillation network (IMDN) and propose a lightweight and accurate SISR model called the residual feature distillation network (RFDN). RFDN uses multiple feature distillation connections to learn more discriminative feature representations. We also propose a shallow residual block (SRB) as the main building block of RFDN so that the network benefits most from residual learning while remaining lightweight. Extensive experimental results show that the proposed RFDN achieves a better trade-off between performance and model complexity than state-of-the-art methods. Moreover, we propose an enhanced RFDN (E-RFDN), which won first place in the AIM 2020 efficient super-resolution challenge. Code will be available at https://github.com/njulj/RFDN.
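The claimed equivalence between channel splitting and the feature distillation connection can be illustrated with a minimal sketch. All names and shapes below are illustrative assumptions, not the authors' code: a feature map is modeled as a list of channels, and a 1x1 convolution (the core of an FDC) reduces to a learned weighted sum over channels.

```python
# Hypothetical sketch (not the paper's implementation): compares IMDN-style
# channel splitting with a feature distillation connection (FDC).

def channel_split(x, n_distill):
    """IMDN-style split: the first n_distill channels are the distilled
    features; the remaining channels are passed on as coarse features."""
    return x[:n_distill], x[n_distill:]

def fdc(x, weights):
    """FDC sketch: each output channel is a 1x1 convolution, i.e. a weighted
    sum of all input channels at every spatial position."""
    h, w = len(x[0]), len(x[0][0])
    return [[[sum(wk * x[c][i][j] for c, wk in enumerate(row))
              for j in range(w)]
             for i in range(h)]
            for row in weights]

# Toy 4-channel, 2x2 feature map.
x = [[[float(c * 10 + i * 2 + j) for j in range(2)]
      for i in range(2)]
     for c in range(4)]
distilled, coarse = channel_split(x, 2)

# A 1x1 conv whose weights simply select the first two channels reproduces
# the split exactly, so an FDC is at least as expressive as channel splitting
# while its weights remain learnable (hence "more flexible").
w_select = [[1.0, 0.0, 0.0, 0.0],
            [0.0, 1.0, 0.0, 0.0]]
assert fdc(x, w_select) == distilled
```

Because the selection weights are a special case of the learned 1x1 projection, the FDC can recover channel splitting exactly and otherwise learn a more useful mixture of channels.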