Neural architecture search (NAS) is a recent methodology for automating the design of neural network architectures. Differentiable architecture search (DARTS) is a promising NAS approach that dramatically increases search efficiency. However, it has been shown to suffer from performance collapse, in which the search often converges on detrimental architectures. Many recent works attempt to address this issue by identifying indicators for early stopping, regularising the search objective to reduce the dominance of certain operations, or changing the parameterisation of the search problem. In this work, we hypothesise that performance collapse can arise from poor local optima around typical initial architectures and weights. We address this issue by developing a more global optimisation scheme that can explore the space more thoroughly without changing the DARTS problem formulation. Our experiments show that these changes to the search algorithm allow the discovery of architectures with both better test performance and fewer parameters.
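For context, the DARTS problem formulation that is left unchanged here is, in its standard form (Liu et al., 2019), a bilevel optimisation over continuous architecture parameters $\alpha$ and network weights $w$:

$$\min_{\alpha}\ \mathcal{L}_{\mathrm{val}}\big(w^{*}(\alpha),\, \alpha\big) \quad \text{s.t.} \quad w^{*}(\alpha) = \operatorname*{arg\,min}_{w}\ \mathcal{L}_{\mathrm{train}}(w, \alpha),$$

where each edge $(i, j)$ of the searched cell computes a softmax-weighted mixture of the candidate operations $o \in \mathcal{O}$:

$$\bar{o}^{(i,j)}(x) = \sum_{o \in \mathcal{O}} \frac{\exp\big(\alpha_{o}^{(i,j)}\big)}{\sum_{o' \in \mathcal{O}} \exp\big(\alpha_{o'}^{(i,j)}\big)}\, o(x).$$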