In this work, we present a simple and general search space shrinking method, called Angle-Based search space Shrinking (ABS), for Neural Architecture Search (NAS). Our approach progressively simplifies the original search space by dropping unpromising candidates, thus reducing the difficulty for existing NAS methods to find superior architectures. In particular, we propose an angle-based metric to guide the shrinking process. We provide comprehensive evidence showing that, in a weight-sharing supernet, the proposed metric is more stable and accurate than accuracy-based and magnitude-based metrics in predicting the capability of child models. We also show that the angle-based metric converges fast during supernet training, enabling us to obtain promising shrunk search spaces efficiently. ABS can be easily applied to most NAS approaches (e.g., SPOS, FairNAS, ProxylessNAS, DARTS, and PDARTS). Comprehensive experiments show that ABS dramatically enhances existing NAS approaches by providing a promising shrunk search space.
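The abstract does not spell out how the angle-based metric is computed. As a minimal sketch, assuming the score is the angle between a child model's flattened weight vector at supernet initialization and the corresponding vector after supernet training (with larger angles read as more promising), it could look like the following; the function name `angle_metric` and the ranking snippet are purely illustrative and not taken from the paper.

```python
import numpy as np

def angle_metric(init_weights, trained_weights):
    """Angle (radians) between a child model's flattened weight vector at
    initialization and after supernet training. Assumption for illustration:
    a larger angle indicates a more promising candidate."""
    v0 = np.concatenate([np.asarray(w).ravel() for w in init_weights])
    vt = np.concatenate([np.asarray(w).ravel() for w in trained_weights])
    cos = np.dot(v0, vt) / (np.linalg.norm(v0) * np.linalg.norm(vt) + 1e-12)
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# Hypothetical shrinking step: score every candidate and keep the top half.
# scores = {arch: angle_metric(init[arch], trained[arch]) for arch in candidates}
# kept = sorted(candidates, key=scores.get, reverse=True)[: len(candidates) // 2]
```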