Single Image Super-Resolution (SISR) has achieved significant performance gains with deep neural networks. However, the large number of parameters in CNN-based SISR methods requires heavy computation. Although several efficient SISR models have been proposed recently, most are handcrafted and thus lack flexibility. In this work, we propose a novel differentiable Neural Architecture Search (NAS) approach at both the cell level and the network level to search for lightweight SISR models. Specifically, the cell-level search space is designed based on an information distillation mechanism, focusing on combinations of lightweight operations and aiming to build a more lightweight yet accurate SR structure. The network-level search space considers the feature connections among cells and aims to find which information flows benefit each cell most, thereby boosting performance. Unlike existing Reinforcement Learning (RL) or Evolutionary Algorithm (EA) based NAS methods for SISR tasks, our search pipeline is fully differentiable, so lightweight SISR models can be searched efficiently and jointly at the cell and network levels on a single GPU. Experiments show that our method achieves state-of-the-art performance on benchmark datasets in terms of PSNR, SSIM, and model complexity, with merely 68G Multi-Adds for $\times 2$ and 18G Multi-Adds for $\times 4$ SR tasks. Code will be available at \url{https://github.com/DawnHH/DLSR-PyTorch}.
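To make the differentiable relaxation concrete, the sketch below shows a minimal DARTS-style formulation consistent with the abstract: each edge mixes lightweight candidate operations through softmax-weighted architecture parameters (cell level), and each cell aggregates softmax-weighted features from earlier cells (network level). The candidate operations, the names (`MixedOp`, `SuperNet`, `alphas`, `betas`), and the one-op-per-cell simplification are illustrative assumptions, not the paper's exact search space.

```python
# A minimal sketch of differentiable cell- and network-level search,
# assuming a DARTS-style continuous relaxation. Not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """Cell level: softmax-weighted sum of lightweight candidate operations."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 1),                               # 1x1 conv
            nn.Conv2d(channels, channels, 3, padding=1),                    # 3x3 conv
            nn.Conv2d(channels, channels, 3, padding=1, groups=channels),   # depthwise 3x3
            nn.Identity(),                                                  # skip connection
        ])
        # Architecture parameters: one logit per candidate op, learned by gradient descent.
        self.alphas = nn.Parameter(1e-3 * torch.randn(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alphas, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

class SuperNet(nn.Module):
    """Network level: each cell (here simplified to one MixedOp) reads a
    softmax-weighted combination of the stem and all earlier cell outputs."""
    def __init__(self, channels=32, num_cells=4):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, 3, padding=1)
        self.cells = nn.ModuleList(MixedOp(channels) for _ in range(num_cells))
        # betas[i][j]: weight of the connection from feature j into cell i.
        self.betas = nn.ParameterList(
            nn.Parameter(1e-3 * torch.randn(i + 1)) for i in range(num_cells)
        )

    def forward(self, x):
        feats = [self.stem(x)]
        for i, cell in enumerate(self.cells):
            w = F.softmax(self.betas[i], dim=0)
            feats.append(cell(sum(wj * fj for wj, fj in zip(w, feats))))
        return feats[-1]

# Usage: both network weights and architecture parameters (alphas, betas)
# receive gradients from an ordinary SR reconstruction loss.
net = SuperNet()
out = net(torch.randn(1, 3, 48, 48))
```

After search, the highest-weighted operation on each edge and the dominant connections would be retained to derive the discrete lightweight model, following the standard differentiable-NAS recipe.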