Neural architecture search (NAS) has advanced many areas of machine learning. Despite these contributions, NAS is often criticized for its intrinsically high computational cost. We aim to alleviate this cost by proposing a pretraining scheme that is generally applicable to controller-based NAS. Our method, a locality-based self-supervised classification task, leverages the structural similarity of network architectures to obtain good architecture representations. We incorporate our method into neural architecture optimization (NAO) to analyze the pretrained embeddings and their effectiveness, and we highlight that adding a metric learning loss has a favorable impact on NAS. Our code is available at \url{https://github.com/Multi-Objective-NAS/self-supervised-nas}.