Recently, Neural Architecture Search (NAS) methods have been introduced and have shown impressive performance on many benchmarks. Among these NAS studies, the Neural Architecture Transformer (NAT) aims to adapt a given neural architecture to improve performance while maintaining computational cost. However, NAT lacks reproducibility, and it requires an additional architecture adaptation process before network weight training. In this paper, we propose a proxyless neural architecture adaptation method that is reproducible and efficient. Our method can be applied to both supervised and self-supervised learning, and it shows stable performance across various architectures. Extensive reproducibility experiments on two datasets, i.e., CIFAR-10 and Tiny ImageNet, show that the proposed method clearly outperforms NAT and is applicable to other models and datasets.