Existing neural architecture search (NAS) methods often return an architecture that achieves good search performance but generalizes poorly to the test setting. To achieve better generalization, we propose a novel neighborhood-aware NAS formulation to identify flat-minima architectures in the search space, under the assumption that flat minima generalize better than sharp minima. The phrase "flat-minima architecture" refers to architectures whose performance is stable under small perturbations of the architecture (e.g., replacing a convolution with a skip connection). Our formulation takes the "flatness" of an architecture into account by aggregating performance over the neighborhood of that architecture. We demonstrate a principled way to apply our formulation to existing search algorithms, including sampling-based and gradient-based algorithms. To facilitate the application to gradient-based algorithms, we also propose a differentiable representation for the neighborhood of architectures. Based on our formulation, we propose neighborhood-aware random search (NA-RS) and neighborhood-aware differentiable architecture search (NA-DARTS). Notably, by simply augmenting DARTS with our formulation, NA-DARTS finds architectures that perform better than or on par with those found by state-of-the-art NAS methods on established benchmarks, including CIFAR-10, CIFAR-100 and ImageNet.
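The neighborhood-aware idea can be sketched in code. The following is a minimal illustration of neighborhood-aware random search (NA-RS), not the paper's implementation: the search space, `OPS`, `NUM_EDGES`, and the `evaluate` stand-in are all hypothetical, and the neighborhood is taken to be all architectures differing in exactly one operation choice, with the aggregate being a plain mean.

```python
import random

# Hypothetical discrete search space: an architecture is a tuple of
# operation choices, one per edge (names are illustrative only).
OPS = ["conv3x3", "conv5x5", "skip", "max_pool"]
NUM_EDGES = 4


def evaluate(arch):
    """Stand-in for validation accuracy in [0, 1].

    A real NAS run would train the architecture (or query a weight-sharing
    supernet / benchmark table) here; this deterministic hash is a dummy.
    """
    return sum(hash((op, i)) % 100 for i, op in enumerate(arch)) / (100 * len(arch))


def neighbors(arch):
    """All architectures differing from `arch` in exactly one operation."""
    result = []
    for i in range(len(arch)):
        for op in OPS:
            if op != arch[i]:
                result.append(arch[:i] + (op,) + arch[i + 1:])
    return result


def neighborhood_score(arch):
    """Aggregate performance over the architecture and its neighborhood.

    Architectures whose performance is stable under one-operation
    perturbations (flat minima) score higher than sharp ones whose
    neighbors perform much worse.
    """
    nbrs = neighbors(arch)
    total = evaluate(arch) + sum(evaluate(n) for n in nbrs)
    return total / (1 + len(nbrs))


def na_random_search(num_samples=50, seed=0):
    """Random search that ranks candidates by neighborhood score."""
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(num_samples):
        arch = tuple(rng.choice(OPS) for _ in range(NUM_EDGES))
        score = neighborhood_score(arch)
        if score > best_score:
            best, best_score = arch, score
    return best, best_score
```

Swapping `neighborhood_score` for plain `evaluate` recovers ordinary random search; the only change the formulation makes is the objective used to rank candidates. The gradient-based variant (NA-DARTS) instead needs the differentiable neighborhood representation described above, which this discrete sketch does not cover.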