In the past decade, advances in deep learning have resulted in breakthroughs in a variety of areas, including computer vision, natural language understanding, speech recognition, and reinforcement learning. Specialized, high-performing neural architectures are crucial to the success of deep learning in these areas. Neural architecture search (NAS), the process of automating the design of neural architectures for a given task, is an inevitable next step in automating machine learning and has already outpaced the best human-designed architectures on many tasks. In the past few years, research in NAS has been progressing rapidly, with over 1000 papers released since 2020 (Deng and Lindauer, 2021). In this survey, we provide an organized and comprehensive guide to neural architecture search. We give a taxonomy of search spaces, algorithms, and speedup techniques, and we discuss resources such as benchmarks, best practices, other surveys, and open-source libraries.