Differentiable architecture search (DARTS) has significantly advanced the development of NAS techniques thanks to its high search efficiency and effectiveness, but it suffers from performance collapse. In this paper, we make efforts to alleviate the performance collapse problem for DARTS from two aspects. First, we investigate the expressive power of the supernet in DARTS and then derive a new setup of the DARTS paradigm in which only BatchNorm is trained. Second, we show theoretically that random features dilute the auxiliary-connection role of the skip connection in supernet optimization and let the search algorithm focus on fairer operation selection, thereby resolving the performance collapse problem. We instantiate DARTS and PC-DARTS with random features to build an improved version of each, named RF-DARTS and RF-PCDARTS respectively. Experimental results show that RF-DARTS obtains \textbf{94.36\%} test accuracy on CIFAR-10 (the closest to the optimal result in NAS-Bench-201), and achieves a new state-of-the-art top-1 test error of \textbf{24.0\%} on ImageNet when transferring from CIFAR-10. Moreover, RF-DARTS performs robustly across three datasets (CIFAR-10, CIFAR-100, and SVHN) and four search spaces (S1-S4). In addition, RF-PCDARTS achieves even better results on ImageNet, namely \textbf{23.9\%} top-1 and \textbf{7.1\%} top-5 test error, surpassing representative methods from the single-path, training-free, and partial-channel paradigms that search directly on ImageNet.
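The "only training BatchNorm" setup described above can be made concrete with a short sketch. The following is a minimal PyTorch illustration under our own assumptions, not the authors' released code: the toy supernet, the helper name \texttt{freeze\_all\_but\_batchnorm}, and the hyperparameters are placeholders. The idea it demonstrates is that every operation weight stays frozen at its random initialization, so the supernet computes random features, while only the BatchNorm affine parameters (and, in full DARTS, the architecture parameters) are updated.

\begin{verbatim}
import torch
import torch.nn as nn

def freeze_all_but_batchnorm(supernet: nn.Module) -> None:
    """Keep all weights at random init; train only BatchNorm affine params."""
    for module in supernet.modules():
        if isinstance(module, nn.BatchNorm2d):
            for p in module.parameters(recurse=False):
                p.requires_grad = True    # gamma and beta remain trainable
        else:
            for p in module.parameters(recurse=False):
                p.requires_grad = False   # random features: never updated

# Toy stand-in for one supernet operation: a conv followed by BatchNorm.
supernet = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1, bias=False),
    nn.BatchNorm2d(16, affine=True),
)
freeze_all_but_batchnorm(supernet)

# Only BatchNorm's gamma/beta receive updates; the conv stays a random feature.
weight_optimizer = torch.optim.SGD(
    (p for p in supernet.parameters() if p.requires_grad),
    lr=0.025, momentum=0.9,
)
\end{verbatim}

In a full DARTS-style search loop, the architecture parameters would still be optimized by their own optimizer exactly as in standard DARTS; the sketch only changes which supernet weights are trainable.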