Recently, much attention has been devoted to neural architecture search (NAS), which aims to outperform manually designed neural architectures on high-level vision recognition tasks. Inspired by this success, here we attempt to leverage NAS techniques to automatically design efficient network architectures for low-level image restoration tasks. In particular, we propose a memory-efficient hierarchical NAS (termed HiNAS) and apply it to two such tasks: image denoising and image super-resolution. HiNAS adopts gradient-based search strategies and builds a flexible hierarchical search space consisting of an inner search space and an outer search space, which are responsible for designing cell architectures and deciding cell widths, respectively. For the inner search space, we propose a layer-wise architecture sharing strategy (LWAS), which results in more flexible architectures and better performance. For the outer search space, we design a cell-sharing strategy that saves memory and considerably accelerates the search. The proposed HiNAS method is both memory- and computation-efficient: with a single GTX1080Ti GPU, it takes only about 1 hour to search for the denoising network on the BSD-500 dataset and 3.5 hours to search for the super-resolution architecture on the DIV2K dataset. Experiments show that the architectures found by HiNAS have fewer parameters and enjoy faster inference, while achieving highly competitive performance compared with state-of-the-art methods. Code is available at: https://github.com/hkzhang91/HiNAS
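To make the "gradient-based search strategies" mentioned above concrete, the sketch below shows a minimal, generic differentiable cell-search edge in the spirit of DARTS, where each candidate operation is weighted by a softmax over learnable architecture parameters so that the topology can be optimized by gradient descent together with the network weights. This is only an illustrative assumption, not the authors' HiNAS implementation; the candidate operations, channel width, and names used here are hypothetical.

```python
# Minimal sketch of a gradient-based (DARTS-style) searchable edge.
# Assumption: candidate ops and widths are illustrative, not HiNAS's actual search space.
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical pool of candidate operations on one edge of a cell.
CANDIDATE_OPS = {
    "conv3x3": lambda c: nn.Conv2d(c, c, 3, padding=1),
    "conv5x5": lambda c: nn.Conv2d(c, c, 5, padding=2),
    "dil_conv3x3": lambda c: nn.Conv2d(c, c, 3, padding=2, dilation=2),
    "skip": lambda c: nn.Identity(),
}

class MixedOp(nn.Module):
    """One searchable edge: a softmax-weighted sum of candidate operations."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([build(channels) for build in CANDIDATE_OPS.values()])
        # One learnable architecture parameter per candidate operation.
        self.alpha = nn.Parameter(1e-3 * torch.randn(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# The relaxed edge is differentiable w.r.t. both the conv weights and alpha,
# so both can be updated by gradient descent during the search.
edge = MixedOp(channels=32)
x = torch.randn(2, 32, 48, 48)
loss = edge(x).mean()
loss.backward()                      # gradients flow to conv weights and edge.alpha
print(F.softmax(edge.alpha, dim=0))  # current preference over candidate ops
```

After the search, the operation with the largest architecture weight on each edge would typically be kept to form the final cell; how HiNAS derives its final architectures and shares cells across the hierarchy is described in the paper itself.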