Neural Architecture Search (NAS) aims to automatically find effective architectures within a predefined search space. However, the search space is often extremely large, so searching directly in it is non-trivial and very time-consuming. To address these issues, in each search step we seek to restrict the search to a small but effective subspace, boosting both search performance and search efficiency. To this end, we propose a novel Neural Architecture Search method via Automatic Subspace Evoking (ASE-NAS) that finds promising architectures in automatically evoked subspaces. Specifically, we first perform a global search, i.e., automatic subspace evoking, to evoke/find a good subspace from a set of candidates. Then, we perform a local search within the evoked subspace to find an effective architecture. More critically, we further boost search performance by taking well-designed/searched architectures as the initial candidate subspaces. Extensive experiments show that our ASE-NAS not only greatly reduces the search cost but also finds better architectures than state-of-the-art methods in various benchmark search spaces.
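The following is a minimal toy sketch of the two-stage idea described above (a global search that evokes a subspace from a set of candidates, followed by a local search within that subspace). It is not the authors' actual algorithm: the `evaluate`, `global_search`, and `local_search` functions, and the random-tuple encoding of architectures, are hypothetical stand-ins for a real NAS proxy estimator and search procedure.

```python
import random

# Toy illustration: each "subspace" is a small list of candidate
# architectures; evaluate() is a stand-in for any proxy performance estimate.

def evaluate(arch):
    # Placeholder scorer; a real NAS system would train or estimate accuracy.
    return sum(arch) + random.random()

def global_search(subspaces):
    # "Automatic subspace evoking": pick the candidate subspace whose
    # sampled architectures score best under the proxy estimate.
    def subspace_score(subspace):
        samples = random.sample(subspace, min(3, len(subspace)))
        return max(evaluate(a) for a in samples)
    return max(subspaces, key=subspace_score)

def local_search(subspace, steps=10):
    # Search only inside the evoked subspace for an effective architecture.
    best, best_score = None, float("-inf")
    for _ in range(steps):
        arch = random.choice(subspace)
        score = evaluate(arch)
        if score > best_score:
            best, best_score = arch, score
    return best, best_score

if __name__ == "__main__":
    # Toy candidate subspaces; ASE-NAS would instead seed these with
    # well-designed/searched architectures rather than random tuples.
    candidate_subspaces = [
        [tuple(random.randint(0, 4) for _ in range(5)) for _ in range(20)]
        for _ in range(4)
    ]
    evoked = global_search(candidate_subspaces)
    arch, score = local_search(evoked)
    print("best architecture:", arch, "score:", round(score, 3))
```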