To improve the search efficiency of Neural Architecture Search (NAS), one-shot NAS proposes to train a single super-net that approximates the performance of candidate architectures during search via weight-sharing. While this greatly reduces the computational cost, the approximation error makes the performance prediction of a single super-net less accurate than training each candidate architecture from scratch, leading to search inefficiency. In this work, we propose few-shot NAS, which explores the use of multiple super-nets: each super-net is pre-trained to be responsible for a sub-region of the search space, which reduces its prediction error. Moreover, these super-nets can be trained jointly via sequential fine-tuning. A natural choice of sub-regions is to follow the way the search space is split in NAS. We empirically evaluate our approach on three different tasks in NAS-Bench-201. Extensive results demonstrate that few-shot NAS, using only 5 super-nets, significantly improves the performance of many search methods with only a slight increase in search time. The architectures found by DARTS and ENAS with few-shot models achieve 88.53% and 86.50% test accuracy on CIFAR-10 in NAS-Bench-201, significantly outperforming their one-shot counterparts (both at 54.30% test accuracy). Moreover, on AUTOGAN and DARTS, few-shot NAS also outperforms previous state-of-the-art models.
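A minimal sketch of the idea described above, assuming a toy search space where one edge's operation is chosen from a small candidate set: the one-shot super-net's weights are copied into one sub-super-net per candidate choice, and each sub-super-net is then fine-tuned only on its sub-region. All names here (`ToySuperNet`, `make_sub_supernets`, `finetune`) are hypothetical illustrations, not the authors' implementation.

```python
# Hypothetical sketch of few-shot NAS: split the search space on one edge,
# give each sub-region its own super-net, and fine-tune sequentially.
import copy
import torch
import torch.nn as nn

OPS = ["conv3x3", "conv1x1", "skip"]  # candidate operations on the split edge


class ToySuperNet(nn.Module):
    """Weight-sharing super-net: every candidate op keeps its own weights,
    and each forward pass activates exactly one op (one-shot weight sharing)."""

    def __init__(self, channels=8):
        super().__init__()
        self.ops = nn.ModuleDict({
            "conv3x3": nn.Conv2d(channels, channels, 3, padding=1),
            "conv1x1": nn.Conv2d(channels, channels, 1),
            "skip": nn.Identity(),
        })
        self.head = nn.Linear(channels, 10)

    def forward(self, x, op_name):
        x = self.ops[op_name](x)
        return self.head(x.mean(dim=(2, 3)))


def make_sub_supernets(one_shot: ToySuperNet):
    """One sub-super-net per sub-region, initialized from the one-shot weights."""
    return {op: copy.deepcopy(one_shot) for op in OPS}


def finetune(sub_net, fixed_op, steps=10):
    """Fine-tune a sub-super-net only on architectures inside its sub-region
    (here: architectures whose split edge uses `fixed_op`)."""
    opt = torch.optim.SGD(sub_net.parameters(), lr=0.01)
    for _ in range(steps):
        x = torch.randn(4, 8, 16, 16)          # stand-in for a training batch
        y = torch.randint(0, 10, (4,))
        loss = nn.functional.cross_entropy(sub_net(x, fixed_op), y)
        opt.zero_grad()
        loss.backward()
        opt.step()


if __name__ == "__main__":
    one_shot = ToySuperNet()
    sub_nets = make_sub_supernets(one_shot)     # one super-net per sub-region
    for op, net in sub_nets.items():            # sequential fine-tuning
        finetune(net, fixed_op=op)
    # During search, an architecture whose split-edge op is `op` would be
    # evaluated with sub_nets[op], whose smaller sub-region should yield a
    # lower prediction error than the single one-shot super-net.
```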