Neural architecture search (NAS) promises to make deep learning accessible to non-experts by automating the architecture engineering of deep neural networks. BANANAS is a state-of-the-art NAS method embedded within the Bayesian optimization framework. Recent experimental findings have demonstrated that the strong performance of BANANAS on the NAS-Bench-101 benchmark is determined by its path encoding rather than by its choice of surrogate model. We present experimental results suggesting that the performance of BANANAS on the NAS-Bench-301 benchmark is determined by its acquisition function optimizer, which minimally mutates the incumbent.
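To illustrate the mechanism named above, the following is a minimal, hypothetical sketch of a mutation-based acquisition function optimizer: candidates are generated by small random mutations of the incumbent (the best architecture found so far), and the candidate maximizing the acquisition value is returned. The architecture representation, operation set, and acquisition function are illustrative placeholders, not the actual BANANAS implementation.

```python
import random

# Illustrative operation vocabulary (placeholder, not NAS-Bench-301's space).
OPS = ["conv3x3", "conv1x1", "maxpool", "skip"]

def mutate(arch, n_edits=1):
    """Return a copy of `arch` with `n_edits` positions randomly changed
    to a different operation (a 'minimal' mutation when n_edits=1)."""
    child = list(arch)
    for pos in random.sample(range(len(child)), n_edits):
        child[pos] = random.choice([op for op in OPS if op != child[pos]])
    return child

def optimize_acquisition(incumbent, acquisition, n_candidates=100):
    """Generate candidates by minimally mutating the incumbent and
    return the one that maximizes the acquisition function."""
    candidates = [mutate(incumbent) for _ in range(n_candidates)]
    return max(candidates, key=acquisition)

if __name__ == "__main__":
    random.seed(0)
    incumbent = ["conv3x3", "maxpool", "skip", "conv1x1"]
    # Toy acquisition function standing in for a surrogate-based score.
    acq = lambda arch: arch.count("conv3x3")
    best = optimize_acquisition(incumbent, acq)
    print(best)
```

Because every candidate differs from the incumbent in a single position, this optimizer searches only a tight neighborhood of the current best architecture, which is the locality the abstract identifies as driving performance.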