One-shot Neural Architecture Search (NAS) aims to minimize the computational expense of discovering state-of-the-art models. However, in the past year attention has been drawn to the fact that naive random search performs comparably across the same search spaces used by leading NAS algorithms. To address this, we explore the effects of drastically relaxing the NAS search space, and we present Bonsai-Net, an efficient one-shot NAS method for exploring our relaxed search space. Bonsai-Net is built around a modified differential pruner and consistently discovers state-of-the-art architectures that significantly outperform random search while using fewer parameters than other state-of-the-art methods. Additionally, Bonsai-Net performs model search and training simultaneously, dramatically reducing the total time it takes to generate fully-trained models from scratch.
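To make the idea of a differential pruner concrete, below is a minimal PyTorch sketch, not the paper's implementation: each candidate operation on a mixed edge carries a learnable architecture logit, and operations whose mixture weight stays low are pruned away while the shared weights keep training. The names `MixedEdge` and `prune_below` and the threshold value are hypothetical, chosen only for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedEdge(nn.Module):
    """A hypothetical mixed edge: a weighted sum of candidate ops
    whose weights are learned jointly with the model weights."""

    def __init__(self, ops):
        super().__init__()
        self.ops = nn.ModuleList(ops)
        # one learnable architecture logit per candidate operation
        self.alpha = nn.Parameter(torch.zeros(len(ops)))

    def forward(self, x):
        # forward pass is a softmax-weighted sum of surviving ops
        w = F.softmax(self.alpha, dim=0)
        return sum(w[i] * op(x) for i, op in enumerate(self.ops))

    def prune_below(self, threshold):
        # drop ops whose mixture weight fell below the threshold,
        # shrinking the model as search and training proceed together
        w = F.softmax(self.alpha, dim=0)
        keep = [i for i in range(len(self.ops)) if w[i] >= threshold]
        self.ops = nn.ModuleList(self.ops[i] for i in keep)
        self.alpha = nn.Parameter(self.alpha.detach()[keep])

# usage: a toy edge with two candidate ops on a CIFAR-sized input
edge = MixedEdge([nn.Conv2d(3, 3, 3, padding=1), nn.Identity()])
out = edge(torch.randn(1, 3, 32, 32))
edge.prune_below(0.25)
```

Because pruning physically removes operations rather than merely zeroing their weights, the supernet shrinks during search, which is one way the simultaneous search-and-train setup described above can cut total wall-clock time.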