Neural architecture search (NAS) has been an active direction in automated machine learning (AutoML), aiming to discover efficient network structures. The searched architecture is typically evaluated by training on datasets with fixed data augmentation policies. However, recent work on auto-augmentation shows that the best-suited augmentation policies can vary across architectures. Therefore, this work considers the possible coupling between neural architectures and data augmentation and proposes an effective algorithm that searches for both jointly. Specifically, 1) for the NAS task, we adopt a single-path differentiable method with a Gumbel-softmax reparameterization strategy, owing to its memory efficiency; 2) for the auto-augmentation task, we introduce a novel search method based on the policy gradient algorithm, which significantly reduces the computational complexity. Our approach achieves 97.91% accuracy on CIFAR-10 and 76.6% Top-1 accuracy on ImageNet, demonstrating the strong performance of our search algorithm.
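The single-path sampling with Gumbel-softmax reparameterization mentioned above can be sketched as follows. This is a minimal pure-Python illustration of the general trick, not the paper's implementation; the temperature `tau`, the toy operation probabilities, and the function names are assumptions for the example.

```python
import math
import random

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Sample a relaxed categorical distribution via the Gumbel-softmax trick."""
    rng = rng or random.Random()
    # Perturb each logit with Gumbel(0, 1) noise: -log(-log(U)), U ~ Uniform(0, 1)
    noisy = [(l - math.log(-math.log(rng.random()))) / tau for l in logits]
    # Numerically stable softmax over the perturbed logits
    m = max(noisy)
    exps = [math.exp(v - m) for v in noisy]
    s = sum(exps)
    return [e / s for e in exps]

# Toy architecture parameters over 4 candidate operations on one edge (assumed values)
logits = [math.log(p) for p in (0.1, 0.2, 0.3, 0.4)]
probs = gumbel_softmax(logits, tau=0.5, rng=random.Random(0))
# Single-path evaluation keeps only the argmax operation in the forward pass,
# while the soft probabilities provide gradients for the architecture parameters.
chosen = max(range(len(probs)), key=probs.__getitem__)
```

Lower temperatures make the sample closer to a hard one-hot choice (a single path), while higher temperatures yield a smoother mixture, which is the usual trade-off when annealing `tau` during search.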