In recent years, neural architecture search (NAS) has attracted increasing attention from both academia and industry. Despite a steady stream of impressive empirical results, most existing NAS algorithms are computationally prohibitive due to the costly iterations of stochastic gradient descent (SGD) training needed to evaluate each candidate architecture. In this work, we propose an effective alternative, dubbed Random-Weight Evaluation (RWE), to rapidly estimate the performance of network architectures. By training only the last linear classification layer while the rest of the network keeps its random initialization, RWE reduces the computational cost of evaluating an architecture from hours to seconds. When integrated within an evolutionary multi-objective algorithm, RWE obtains a set of efficient architectures with state-of-the-art performance on CIFAR-10 in less than two hours of searching on a single GPU card. Ablation studies on rank-order correlations and transfer learning experiments on ImageNet further validate the effectiveness of RWE.
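To make the idea concrete, the following is a minimal sketch of a random-weight evaluator in PyTorch. It assumes a candidate `backbone` module that maps an input batch to flattened feature vectors of size `feature_dim`; all names, hyperparameters, and the single-epoch training budget are illustrative assumptions, not the paper's exact procedure.

```python
import torch
import torch.nn as nn

def random_weight_evaluation(backbone, feature_dim, num_classes,
                             train_loader, val_loader, device="cuda"):
    """Hypothetical sketch of Random-Weight Evaluation (RWE): the candidate
    backbone keeps its random initialization; only a linear classifier on
    top is trained, so scoring an architecture takes seconds, not hours."""
    backbone = backbone.to(device).eval()
    for p in backbone.parameters():
        p.requires_grad_(False)           # freeze the randomly initialized backbone

    classifier = nn.Linear(feature_dim, num_classes).to(device)
    optimizer = torch.optim.SGD(classifier.parameters(), lr=0.1)
    criterion = nn.CrossEntropyLoss()

    # One brief pass over the data is assumed sufficient for a proxy score.
    for x, y in train_loader:
        x, y = x.to(device), y.to(device)
        with torch.no_grad():
            feats = backbone(x)           # features from random weights
        loss = criterion(classifier(feats), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    # Validation accuracy serves as the fitness estimate for the search.
    correct = total = 0
    with torch.no_grad():
        for x, y in val_loader:
            x, y = x.to(device), y.to(device)
            preds = classifier(backbone(x)).argmax(dim=1)
            correct += (preds == y).sum().item()
            total += y.numel()
    return correct / total
```

In a multi-objective evolutionary search, a score like the one returned above would be paired with a complexity measure (e.g., parameter count or FLOPs) so that the algorithm can trade accuracy against efficiency when selecting candidate architectures.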