Single Image Super-Resolution (SISR) tasks have achieved significant performance with deep neural networks. However, the large number of parameters in CNN-based methods for SISR tasks requires heavy computation. Although several efficient SISR models have recently been proposed, most are handcrafted and thus lack flexibility. In this work, we propose a novel differentiable Neural Architecture Search (NAS) approach that operates on both the cell level and the network level to search for lightweight SISR models. Specifically, the cell-level search space is designed based on an information distillation mechanism, focusing on combinations of lightweight operations with the aim of building a more lightweight yet accurate SR structure. The network-level search space considers the feature connections among cells and aims to find which information flow benefits each cell most, so as to boost performance. Unlike existing Reinforcement Learning (RL) or Evolutionary Algorithm (EA) based NAS methods for SISR tasks, our search pipeline is fully differentiable, so lightweight SISR models can be searched efficiently and jointly at the cell and network levels on a single GPU. Experiments show that our method achieves state-of-the-art performance on the benchmark datasets in terms of PSNR, SSIM, and model complexity, with merely 68G Multi-Adds for $\times 2$ and 18G Multi-Adds for $\times 4$ SR tasks.
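To make the "fully differentiable" search pipeline concrete, the sketch below shows the standard continuous relaxation used by gradient-based NAS (as in DARTS): each searchable edge mixes a pool of candidate operations with softmax-weighted architecture parameters, so the architecture choice receives gradients and can be optimized alongside the network weights on a single GPU. The candidate operation pool, class names, and channel handling here are illustrative assumptions, not the paper's actual cell-level search space.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical pool of lightweight candidate operations; the paper's real
# cell-level space (built on information distillation) is not reproduced here.
def make_ops(channels: int) -> nn.ModuleList:
    return nn.ModuleList([
        nn.Conv2d(channels, channels, 3, padding=1),                    # 3x3 conv
        nn.Conv2d(channels, channels, 1),                               # 1x1 conv
        nn.Conv2d(channels, channels, 3, padding=1, groups=channels),   # depthwise 3x3
        nn.Identity(),                                                  # skip connection
    ])

class MixedOp(nn.Module):
    """DARTS-style continuous relaxation of one searchable edge.

    The output is a softmax-weighted sum over all candidate ops, so the
    architecture parameters `alpha` are trained by ordinary backpropagation;
    after search, the op with the largest weight is kept.
    """
    def __init__(self, channels: int):
        super().__init__()
        self.ops = make_ops(channels)
        # One architecture parameter per candidate op, optimized jointly
        # with (or alternately to) the network weights by gradient descent.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))
```

A network-level search over feature connections among cells can be relaxed the same way, with one softmax-weighted gate per candidate connection, which is what allows both levels to be searched jointly.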