Efficient evaluation of a network architecture drawn from a large search space remains a key challenge in Neural Architecture Search (NAS). Vanilla NAS evaluates each architecture by training it from scratch, which yields the true performance but is extremely time-consuming. Recently, one-shot NAS has substantially reduced the computation cost by training only one supernetwork, a.k.a. supernet, to approximate the performance of every architecture in the search space via weight-sharing. However, the performance estimation can be very inaccurate due to co-adaptation among operations. In this paper, we propose few-shot NAS, which uses multiple supernetworks, called sub-supernets, each covering a different region of the search space, to alleviate the undesired co-adaptation. Since each sub-supernet only covers a small search space, few-shot NAS improves the accuracy of architecture evaluation over one-shot NAS with only a small increase in evaluation cost. With at most 7 sub-supernets, few-shot NAS establishes new SoTA results: on ImageNet, it finds models that reach 80.5% top-1 accuracy at 600 MFLOPS and 77.3% top-1 accuracy at 230 MFLOPS; on CIFAR10, it reaches 98.72% top-1 accuracy without using extra data or transfer learning. In AutoGAN, few-shot NAS outperforms the previously published results by up to 20%. Extensive experiments show that few-shot NAS significantly improves various one-shot methods, including 4 gradient-based and 6 search-based methods on 3 different tasks in NASBench-201 and NASBench-1Shot1.
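To make the core idea concrete, the following is a minimal sketch, not the authors' implementation, of how a one-shot search space can be partitioned into sub-supernets by fixing the operation choice on a single edge of the cell. All names here (`OPS`, `NUM_EDGES`, `split_search_space`) are illustrative assumptions; the edge count loosely mirrors a NASBench-201-style cell.

```python
# Illustrative sketch only: partitioning a one-shot NAS search space into
# sub-supernets, each covering a disjoint region, to reduce co-adaptation
# among operations that share weights in a single supernet.
from itertools import product

OPS = ["conv3x3", "conv1x1", "skip", "zero"]  # candidate operations per edge (assumed)
NUM_EDGES = 6                                 # e.g., a NASBench-201-style cell (assumed)

def full_search_space():
    """One-shot NAS: a single supernet shares weights across every
    architecture, i.e., every assignment of operations to edges."""
    return product(OPS, repeat=NUM_EDGES)

def split_search_space(split_edge=0):
    """Few-shot NAS (sketch): create one sub-supernet per operation choice
    on `split_edge`. Each sub-supernet covers a disjoint region of the
    search space, so fewer architectures co-adapt through shared weights."""
    regions = {}
    for op in OPS:
        regions[op] = [
            arch for arch in full_search_space() if arch[split_edge] == op
        ]
    return regions

if __name__ == "__main__":
    regions = split_search_space(split_edge=0)
    total = len(OPS) ** NUM_EDGES
    for op, archs in regions.items():
        print(f"sub-supernet '{op}': {len(archs)} of {total} architectures")
```

Under these assumptions, splitting on one edge with 4 candidate operations yields 4 sub-supernets, each responsible for 1/4 of the architectures; splitting recursively on more edges trades evaluation accuracy against the cost of training more sub-supernets.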