Automating the search for the best neural network model is a task that has gained increasing relevance in recent years. In this context, Neural Architecture Search (NAS) is the most effective technique, producing results that rival state-of-the-art hand-crafted architectures. However, this approach requires considerable computational resources as well as research time, which makes its usage prohibitive in many real-world scenarios. With its sequential model-based optimization strategy, Progressive Neural Architecture Search (PNAS) represents a possible step forward in addressing this resource issue. Despite the quality of the network architectures it finds, this technique is still constrained by long search times. A significant step in this direction was taken by Pareto-Optimal Progressive Neural Architecture Search (POPNAS), which extends PNAS with a time predictor, casting the search as a multi-objective optimization problem to enable a trade-off between search time and accuracy. This paper proposes a new version of Pareto-Optimal Progressive Neural Architecture Search, called POPNASv2. Our approach enhances the first version and improves its performance. We expanded the search space by adding new operators and improved the quality of both predictors to build more accurate Pareto fronts. Moreover, we introduced cell equivalence checks and enriched the search strategy with an adaptive greedy exploration step. These efforts allow POPNASv2 to achieve PNAS-like performance with an average 4x search time speed-up.
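The multi-objective trade-off mentioned above can be illustrated with a minimal sketch, not taken from the paper: given candidate architectures scored by predicted accuracy and training time, the Pareto front keeps only the candidates that no other candidate dominates (i.e., is at least as accurate and at least as fast, and strictly better in one). All names and values here are illustrative.

```python
def pareto_front(candidates):
    """Return the (accuracy, time) pairs not dominated by any other pair.

    A pair (a, t) dominates (acc, time) when a >= acc and t <= time,
    with at least one strict inequality.
    """
    front = []
    for acc, time in candidates:
        dominated = any(
            (a >= acc and t <= time) and (a > acc or t < time)
            for a, t in candidates
        )
        if not dominated:
            front.append((acc, time))
    return front

# Hypothetical candidates: (predicted accuracy, predicted training time in s).
candidates = [(0.92, 120), (0.90, 60), (0.95, 300), (0.89, 80)]
print(pareto_front(candidates))
# (0.89, 80) is dropped: (0.90, 60) is both more accurate and faster.
```

In a POPNAS-style search, only the architectures on this front would be expanded in the next step, which is how the accuracy/time trade-off prunes the search.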