Differentiable architecture search has gradually become the mainstream research topic in the field of Neural Architecture Search (NAS), owing to its capability to improve efficiency over early NAS methods (EA-based, RL-based). Recent differentiable NAS methods further improve search efficiency, reduce GPU-memory consumption, and address the "depth gap" issue. However, these methods are incapable of tackling non-differentiable objectives, let alone multiple objectives, e.g., performance, robustness, efficiency, and other metrics. We propose an end-to-end architecture search framework towards non-differentiable objectives, TND-NAS, which combines the high efficiency of the differentiable NAS framework with the compatibility with non-differentiable metrics of Multi-objective NAS (MNAS). Under the differentiable NAS framework, with the continuous relaxation of the search space, TND-NAS optimizes the architecture parameters ($\alpha$) in discrete space, while resorting to a search policy that progressively shrinks the supernetwork by $\alpha$. Our representative experiment takes two objectives (Parameters, Accuracy) as an example; we achieve a series of high-performance compact architectures on the CIFAR10 (1.09M/3.3%, 2.4M/2.95%, 9.57M/2.54%) and CIFAR100 (2.46M/18.3%, 5.46M/16.73%, 12.88M/15.20%) datasets. Favorably, under real-world scenarios (resource-constrained, platform-specialized), Pareto-optimal solutions can be conveniently reached by TND-NAS.
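To make the two core ideas concrete, below is a minimal sketch of (i) optimizing $\alpha$ in discrete space by sampling one candidate operation per edge rather than using DARTS's soft weighted sum, and (ii) progressively shrinking the supernetwork by dropping low-$\alpha$ operations. This is an illustrative assumption of how such a scheme could be coded, not the authors' implementation; the names `MixedEdge`, `shrink_edge`, and `reinforce_step` are hypothetical, and the REINFORCE-style update stands in for whatever policy-gradient estimator handles the non-differentiable reward.

```python
# Illustrative sketch only: a DARTS-style mixed edge whose architecture
# parameters (alpha) are optimized in discrete space, plus progressive
# shrinking of the candidate-operation set. Names are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedEdge(nn.Module):
    """One supernetwork edge holding K candidate ops and one alpha per op."""
    def __init__(self, ops):
        super().__init__()
        self.ops = nn.ModuleList(ops)
        self.alpha = nn.Parameter(torch.zeros(len(ops)))

    def forward(self, x):
        # Discrete optimization of alpha: sample ONE op per forward pass
        # instead of the soft weighted sum used by differentiable NAS.
        probs = F.softmax(self.alpha, dim=-1)
        idx = torch.multinomial(probs, 1).item()
        return self.ops[idx](x), torch.log(probs[idx])

def shrink_edge(edge: MixedEdge, keep: int):
    """Progressive shrinking: keep only the top-`keep` ops ranked by alpha,
    reducing GPU-memory consumption as the search proceeds."""
    top = torch.topk(edge.alpha.data, keep).indices.tolist()
    edge.ops = nn.ModuleList([edge.ops[i] for i in top])
    edge.alpha = nn.Parameter(edge.alpha.data[top].clone())

def reinforce_step(log_prob, reward, optimizer):
    """Policy-gradient update on alpha. The scalar `reward` (e.g., a blend
    of accuracy and parameter count) need not be differentiable."""
    loss = -log_prob * reward
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Under this reading, the multi-objective reward (e.g., combining accuracy with parameter count) only enters as a scalar multiplier on the sampled log-probability, which is what lets non-differentiable metrics drive the search.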