This paper addresses the efficiency challenge of Neural Architecture Search (NAS) by formulating the task as a ranking problem. Previous methods require numerous training examples to accurately estimate the performance of candidate architectures, even though the actual goal is only to distinguish "good" candidates from "bad" ones. Here we do not resort to performance predictors. Instead, we propose a performance ranking method (RankNAS) based on pairwise ranking, which enables efficient architecture search with far fewer training examples. Moreover, we develop an architecture selection method that prunes the search space and concentrates on the more promising candidates. Extensive experiments on machine translation and language modeling tasks show that RankNAS can design high-performance architectures while being orders of magnitude faster than state-of-the-art NAS systems.
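The pairwise-ranking idea in the abstract can be illustrated with a minimal sketch: instead of regressing an architecture's exact performance, a scorer is trained only to order pairs of architectures correctly. The code below is an assumption-laden toy (hypothetical 2-D architecture features, a linear scorer, and a RankNet-style logistic pairwise loss), not the RankNAS implementation itself.

```python
import math

def score(w, x):
    """Linear score of an architecture's (hypothetical) feature vector."""
    return sum(wi * xi for wi, xi in zip(w, x))

def train_pairwise(pairs, dim, lr=0.1, epochs=200):
    """Learn weights so that score(better) > score(worse) for each pair.

    Each pair is (features_of_better_arch, features_of_worse_arch);
    the loss is the pairwise logistic (RankNet-style) log-loss.
    """
    w = [0.0] * dim
    for _ in range(epochs):
        for better, worse in pairs:
            # P(better ranks above worse) under a logistic model
            p = 1.0 / (1.0 + math.exp(score(w, worse) - score(w, better)))
            g = 1.0 - p  # gradient scale of the pairwise log-loss
            for i in range(dim):
                w[i] += lr * g * (better[i] - worse[i])
    return w

# Toy data: 2-D features where the first dimension drives quality.
pairs = [([1.0, 0.2], [0.1, 0.9]), ([0.8, 0.5], [0.2, 0.4])]
w = train_pairwise(pairs, dim=2)
assert score(w, [1.0, 0.2]) > score(w, [0.1, 0.9])
```

Because only relative order is supervised, each comparison is a cheap binary label, which is why far fewer training examples suffice than for an accurate performance regressor.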