There is a large body of literature on neural architecture search, but most existing work relies on heuristic rules that severely constrain the search flexibility. In this paper, we first relax these manually designed constraints and enlarge the search space to contain more than $10^{160}$ candidates. In the new space, most existing differentiable search methods fail dramatically. We then propose a novel algorithm named Gradual One-Level Differentiable Neural Architecture Search (GOLD-NAS), which introduces a variable resource constraint into one-level optimization so that weak operators are gradually pruned out of the super-network. On standard image classification benchmarks, GOLD-NAS finds a series of Pareto-optimal architectures within a single search procedure. Most of the discovered architectures have never been studied before, yet they achieve a nice tradeoff between recognition accuracy and model complexity. We believe the new space and search algorithm can advance research on differentiable NAS.
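To make the core idea concrete, the following is a minimal, hypothetical sketch (not the authors' released code) of one-level differentiable search with a gradually strengthened resource penalty: network weights and architecture weights are updated by a single optimizer, and operators whose architecture weights decay below a threshold are pruned from the super-network. All names here (MixedOp, resource_penalty, prune_weak_ops, the lambda schedule, the threshold) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class MixedOp(nn.Module):
    """One edge of the super-network: a weighted sum of candidate operators."""
    def __init__(self, ops):
        super().__init__()
        self.ops = nn.ModuleList(ops)
        # Architecture weights, optimized jointly with network weights (one-level optimization).
        self.alpha = nn.Parameter(torch.ones(len(ops)))

    def forward(self, x):
        w = torch.sigmoid(self.alpha)  # operators compete independently (no softmax coupling)
        return sum(wi * op(x) for wi, op in zip(w, self.ops))

def resource_penalty(mixed_ops, op_flops):
    """Differentiable complexity proxy: each operator's weight times its FLOPs cost."""
    total = 0.0
    for m in mixed_ops:
        total = total + (torch.sigmoid(m.alpha) * op_flops).sum()
    return total

def search_step(model, mixed_ops, op_flops, batch, criterion, optimizer, lam):
    """One-level update: a single optimizer covers both weights and alphas;
    `lam` is the resource coefficient, increased gradually over the search."""
    x, y = batch
    loss = criterion(model(x), y) + lam * resource_penalty(mixed_ops, op_flops)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

def prune_weak_ops(mixed_ops, threshold=0.01):
    """Prune operators whose architecture weight has decayed below the threshold."""
    with torch.no_grad():
        for m in mixed_ops:
            weak = torch.sigmoid(m.alpha) < threshold
            m.alpha[weak] = -1e4  # drives the weight to ~0, effectively removing the operator
```

Sweeping `lam` upward during a single run is what yields a sequence of progressively lighter architectures, i.e. the Pareto front described above.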