This survey categorizes and evaluates supernet optimization methods in the field of Neural Architecture Search (NAS). Supernet optimization trains a single over-parameterized network whose subnetworks collectively cover the entire architecture search space, so that candidate architectures can be evaluated with shared weights rather than trained from scratch. The survey analyses these methods along two axes: spatial optimization, which concerns the architecture and parameters of the supernet and its subnets, and temporal optimization, which concerns the efficiency of selecting architectures from the supernet. The benefits, limitations, and potential applications of these methods across tasks and settings, including transferability, domain generalization, and Transformer models, are also discussed.
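As a concrete illustration of the weight-sharing idea behind supernet optimization, the sketch below trains a toy supernet by uniformly sampling one subnetwork (path) per step. This is a minimal sketch of one common scheme (single-path uniform sampling); the choice operations, layer sizes, and random data are illustrative assumptions, not the method of any particular paper covered by the survey.

```python
# Minimal weight-sharing supernet sketch with uniform single-path
# sampling. All sizes, ops, and data below are illustrative assumptions.
import random
import torch
import torch.nn as nn

class ChoiceBlock(nn.Module):
    """One supernet layer holding several candidate operations."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.Identity(),  # skip connection as a candidate op
        ])

    def forward(self, x, choice):
        # Only the sampled candidate runs this step, so only its
        # (shared) weights receive the gradient update.
        return self.ops[choice](x)

class Supernet(nn.Module):
    def __init__(self, channels=16, depth=4, num_classes=10):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, 3, padding=1)
        self.blocks = nn.ModuleList(ChoiceBlock(channels) for _ in range(depth))
        self.head = nn.Linear(channels, num_classes)

    def forward(self, x, arch):
        x = self.stem(x)
        for block, choice in zip(self.blocks, arch):
            x = block(x, choice)
        x = x.mean(dim=(2, 3))  # global average pooling
        return self.head(x)

net = Supernet()
opt = torch.optim.SGD(net.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

for step in range(100):  # stand-in for a real dataset/training loop
    images = torch.randn(8, 3, 32, 32)
    labels = torch.randint(0, 10, (8,))
    # Sample one subnet (path) uniformly; training the shared weights
    # this way lets many subnets be ranked later without retraining.
    arch = [random.randrange(len(b.ops)) for b in net.blocks]
    opt.zero_grad()
    loss = loss_fn(net(images, arch), labels)
    loss.backward()
    opt.step()
```

In the survey's terms, the choice of which candidate ops and shared parameters the supernet holds is a spatial concern, while how paths are sampled and ranked during and after training is a temporal one.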