In the past few years, neural architecture search (NAS) has become an increasingly important tool within the deep learning community. Despite the many recent successes of NAS, however, most existing approaches operate within highly structured design spaces, and hence explore only a small fraction of the full search space of neural architectures while also requiring significant manual effort from domain experts. In this work, we develop techniques that enable efficient NAS in a significantly larger design space. To accomplish this, we propose to perform NAS in an abstract search space of program properties. Our key insights are as follows: (1) the abstract search space is significantly smaller than the original search space, and (2) architectures with similar program properties also have similar performance; thus, we can search more efficiently in the abstract search space. To enable this approach, we also propose a novel efficient synthesis procedure, which accepts a set of promising program properties and returns a neural architecture satisfying those properties. We implement our approach, $\alpha$NAS, within an evolutionary framework, where the mutations are guided by the program properties. Starting from a ResNet-34 model, $\alpha$NAS produces a model with slightly improved accuracy on CIFAR-10 but with 96% fewer parameters. On ImageNet, $\alpha$NAS is able to improve over Vision Transformer (30% fewer FLOPs and parameters), ResNet-50 (23% fewer FLOPs, 14% fewer parameters), and EfficientNet (7% fewer FLOPs and parameters) without any degradation in accuracy.
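The abstract describes $\alpha$NAS as an evolutionary search whose mutations are applied to abstract program properties and then realized by a synthesis procedure. A minimal sketch of such a property-guided loop is given below; the callables `abstract`, `mutate_properties`, `synthesize`, and `fitness` are hypothetical placeholders standing in for the paper's components, not its actual API.

```python
import random
from typing import Callable, List, TypeVar

Arch = TypeVar("Arch")    # a concrete neural architecture
Props = TypeVar("Props")  # its abstract program properties


def property_guided_evolution(
    initial_arch: Arch,
    abstract: Callable[[Arch], Props],            # lift an architecture to its properties
    mutate_properties: Callable[[Props], Props],  # perturb properties in the abstract space
    synthesize: Callable[[Props, Arch], Arch],    # return an architecture satisfying the properties
    fitness: Callable[[Arch], float],             # e.g. validation accuracy minus an efficiency penalty
    population_size: int = 16,
    generations: int = 100,
) -> Arch:
    """Evolutionary search whose mutations act on program properties, not layers."""
    population: List[Arch] = [initial_arch]
    for _ in range(generations):
        # Tournament selection: pick the fittest of a small random sample.
        parent = max(random.sample(population, min(3, len(population))), key=fitness)
        # Mutate in the (much smaller) abstract search space of properties.
        target_props = mutate_properties(abstract(parent))
        # Synthesize a concrete child architecture satisfying the mutated properties.
        child = synthesize(target_props, parent)
        population.append(child)
        # Keep the population bounded by discarding the least fit architectures.
        if len(population) > population_size:
            population = sorted(population, key=fitness, reverse=True)[:population_size]
    return max(population, key=fitness)
```

The design choice mirrored here is the one stated in the abstract: mutation operates in the abstract property space (insights 1 and 2), while the synthesis step maps each mutated property set back to a concrete architecture.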