The manual design of neural network architectures requires significant time and resources. Recent Neural Architecture Search (NAS) techniques have proven competitive with, or superior to, traditional handcrafted design, although they require domain knowledge and have generally relied on limited search spaces. In this paper, we propose a novel framework for neural architecture search that uses a dictionary of models of base tasks and the similarity between the target task and the atoms of the dictionary, thereby generating an adaptive search space built from the base models of the dictionary. By introducing a gradient-based search algorithm, we can evaluate and discover the best architecture in the search space without fully training the networks. Experimental results demonstrate the efficacy of our proposed task-aware approach.
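To make the core idea concrete, the following is a minimal, hypothetical PyTorch sketch of how a similarity-weighted, adaptive search space could be assembled from dictionary atoms and relaxed for gradient-based search. The names (`task_similarity`, `AdaptiveSearchSpace`, `candidate_blocks`) are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch (not the authors' implementation): an adaptive search space
# built from candidate blocks of dictionary atoms, with architecture weights
# initialized from target-task similarity and refined by gradient descent.
import torch
import torch.nn as nn
import torch.nn.functional as F


def task_similarity(target_feats: torch.Tensor, atom_feats: torch.Tensor) -> torch.Tensor:
    """Cosine similarity between a target-task embedding and one dictionary atom."""
    return F.cosine_similarity(target_feats, atom_feats, dim=-1).mean()


class AdaptiveSearchSpace(nn.Module):
    """Softly mixes candidate blocks drawn from the dictionary of base models.

    The architecture parameters `alpha` are optimized with gradients
    (a DARTS-style continuous relaxation), biased toward atoms whose base
    task is similar to the target task, so no candidate is fully trained
    in isolation during the search.
    """

    def __init__(self, candidate_blocks, similarities: torch.Tensor):
        super().__init__()
        self.blocks = nn.ModuleList(candidate_blocks)
        # Initialize architecture weights from task similarities.
        self.alpha = nn.Parameter(torch.log(similarities.clamp_min(1e-6)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * block(x) for w, block in zip(weights, self.blocks))
```

In such a setup, the block with the largest learned weight would be selected as the discovered architecture once the search converges.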