Neural architecture search (NAS) methods seek optimal candidates through efficient weight-sharing supernet training. However, recent studies report poor ranking consistency between the performance of stand-alone architectures and that of the corresponding shared-weight networks. In this paper, we present Prior-Guided One-shot NAS (PGONAS) to strengthen the ranking correlation of supernets. Specifically, we first explore the effect of activation functions and propose a balanced sampling strategy based on the sandwich rule to alleviate weight coupling in the supernet. Then, FLOPs and Zen-Score are adopted to guide supernet training with a ranking correlation loss. Our PGONAS ranks 3rd in the supernet track of the CVPR 2022 Second Lightweight NAS Challenge. Code is available at https://github.com/pprp/CVPR2022-NAS-competition-Track1-3th-solution.