Neural architecture search (NAS) has shown great promise in the field of automated machine learning (AutoML). NAS has outperformed hand-designed networks and taken a significant step toward automating the design of deep neural networks, further reducing the need for human expertise. However, most research targets a single specific task, leaving NAS methods that operate across multiple tasks largely unexplored. Generally, there are two popular ways to find an architecture for a novel task: searching from scratch, which is inefficient by design, or transferring architectures discovered on other tasks, which provides no performance guarantees and is likely suboptimal. In this work, we present a meta-learning framework to warm-start Differentiable Architecture Search (DARTS). DARTS is a NAS method that can be initialized with a transferred architecture and is able to quickly adapt to new tasks. A task similarity measure is used to select the transfer architecture, since architectures found on similar tasks are likely to perform better. Additionally, we employ a simple meta-transfer architecture learned over multiple tasks. Experiments show that warm-started DARTS finds competitively performing architectures while reducing search costs on average by 60%.
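To make the warm-starting idea concrete, the following is a minimal sketch (not the paper's implementation) of selecting a transfer architecture for a new task via a task similarity measure; the names `task_embedding`, `architecture_library`, and `select_warm_start`, as well as the use of cosine similarity, are illustrative assumptions rather than details taken from this work.

```python
# Hypothetical sketch: pick the DARTS architecture parameters ("alphas")
# discovered on the most similar previous task, to initialize a new search.
import numpy as np

def cosine_similarity(a, b):
    # Similarity between two task embeddings (assumed representation).
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def select_warm_start(new_task_embedding, architecture_library):
    """Return the alphas found on the most similar previous task.

    architecture_library: list of (task_embedding, alphas) pairs, where
    `alphas` are architecture parameters discovered on that task.
    """
    best_alphas, best_sim = None, -np.inf
    for task_emb, alphas in architecture_library:
        sim = cosine_similarity(new_task_embedding, task_emb)
        if sim > best_sim:
            best_sim, best_alphas = sim, alphas
    return best_alphas  # used instead of a uniform initialization

# Example usage with random stand-in data (8-dim task embeddings).
library = [(np.random.rand(8), np.random.rand(14, 7)) for _ in range(3)]
warm_start_alphas = select_warm_start(np.random.rand(8), library)
```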