Neural Architecture Search (NAS) has become a de facto approach in the recent trend of AutoML for designing deep neural networks (DNNs). Efficient or near-zero-cost NAS proxies have further been proposed to address the demanding computational cost of NAS, where each candidate architecture requires only one iteration of backpropagation. The values obtained from these proxies are treated as predictions of architecture performance on downstream tasks. However, two significant drawbacks hinder the extended usage of Efficient NAS proxies. (1) Efficient proxies are not adaptive to various search spaces. (2) Efficient proxies are not extensible to multi-modality downstream tasks. Based on these observations, we design an Extensible proxy (Eproxy) that utilizes self-supervised, few-shot training (i.e., 10 iterations of backpropagation), which yields near-zero cost. The key component that makes Eproxy efficient is an untrainable convolution layer, termed the barrier layer, that adds non-linearities to the optimization space so that Eproxy can discriminate the performance of architectures at an early stage. Furthermore, to make Eproxy adaptive to different downstream tasks/search spaces, we propose a Discrete Proxy Search (DPS) to find optimized training settings for Eproxy with only a handful of benchmarked architectures on the target tasks. Our extensive experiments confirm the effectiveness of both Eproxy and Eproxy+DPS. Code is available at https://github.com/leeyeehoo/GenNAS-Zero.
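To make the idea concrete, below is a minimal sketch of how an untrainable "barrier" convolution layer and few-shot proxy scoring could be wired up in PyTorch. It is illustrative only: the kernel size, choice of ReLU, regression target, optimizer, and the `eproxy_score` helper are assumptions, not the authors' released implementation (see the repository above for that).

```python
import torch
import torch.nn as nn


class BarrierLayer(nn.Module):
    """Sketch of an untrainable ("barrier") convolution layer.

    Assumption: a randomly initialized convolution whose weights are frozen,
    followed by a non-linearity, applied to a candidate network's feature map
    before the proxy loss is computed. Hyperparameters are hypothetical.
    """

    def __init__(self, in_channels: int, out_channels: int, kernel_size: int = 3):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size,
                              padding=kernel_size // 2)
        for p in self.conv.parameters():
            p.requires_grad = False  # untrainable: weights stay at random init
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.conv(x))


def eproxy_score(candidate: nn.Module, barrier: BarrierLayer,
                 x: torch.Tensor, target: torch.Tensor,
                 iters: int = 10, lr: float = 1e-3) -> float:
    """Few-shot self-supervised scoring (hypothetical setup): train the
    candidate for a handful of iterations against a fixed regression target
    and report the final loss (lower suggests a better architecture)."""
    opt = torch.optim.Adam(
        (p for p in candidate.parameters() if p.requires_grad), lr=lr)
    loss = torch.tensor(0.0)
    for _ in range(iters):
        opt.zero_grad()
        loss = nn.functional.mse_loss(barrier(candidate(x)), target)
        loss.backward()
        opt.step()
    return loss.item()
```

In this reading, the frozen barrier convolution injects extra non-linearity between the candidate's features and the proxy objective, which is what lets a 10-iteration run separate strong architectures from weak ones; DPS would then search over such training settings (targets, losses, layer configurations) using a small set of benchmarked architectures.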