Neural architecture search (NAS) aims to discover network architectures with desired properties such as high accuracy or low latency. Recently, differentiable NAS (DNAS) has demonstrated promising results while keeping the search cost orders of magnitude lower than reinforcement learning (RL) based NAS. However, DNAS models can optimize only differentiable loss functions during search, and they require an accurate differentiable approximation of non-differentiable criteria. In this work, we present UNAS, a unified framework for NAS that encapsulates recent DNAS and RL-based approaches. Our framework brings the best of both worlds: it enables us to search for architectures with both differentiable and non-differentiable criteria while maintaining a low search cost. Further, we introduce a new search objective based on the generalization gap that prevents the selection of architectures prone to overfitting. We present extensive experiments on the CIFAR-10, CIFAR-100, and ImageNet datasets, and we perform search in two fundamentally different search spaces. We show that UNAS obtains state-of-the-art average accuracy on all three datasets when compared to architectures searched in the DARTS space. Moreover, UNAS finds an efficient and accurate architecture in the ProxylessNAS search space that outperforms existing MobileNetV2-based architectures. The source code is available at https://github.com/NVlabs/unas .
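As a rough illustration of the generalization-gap idea mentioned above, the sketch below scores candidate architectures by penalizing the gap between validation and training loss, so that among candidates with similar validation performance, the one less prone to overfitting is preferred. This is a minimal, hypothetical formulation for intuition only; the function name, the weighting scheme, and `gap_weight` are assumptions, not the authors' exact objective.

```python
# Minimal sketch of a generalization-gap-based search objective
# (hypothetical formulation, not the authors' implementation).

def search_objective(train_loss: float, val_loss: float,
                     gap_weight: float = 1.0) -> float:
    """Score a candidate architecture; lower is better.

    Combines validation loss (accuracy proxy) with the estimated
    generalization gap (overfitting penalty).
    """
    gap = val_loss - train_loss            # estimated generalization gap
    return val_loss + gap_weight * gap     # accuracy term + gap penalty


if __name__ == "__main__":
    # Two hypothetical candidates with equal validation loss:
    # the one with the smaller train/val gap receives the better score.
    print(search_objective(train_loss=0.30, val_loss=0.35))  # small gap -> 0.40
    print(search_objective(train_loss=0.05, val_loss=0.35))  # large gap -> 0.65
```

Under this toy scoring, a candidate that merely memorizes the training set (low training loss, same validation loss) is ranked below one that generalizes, which matches the stated goal of avoiding architectures prone to overfitting.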