Neural architecture search (NAS) has attracted increasing attention in recent years owing to its flexibility and its remarkable ability to reduce the burden of neural network design. To achieve better performance, however, the search process usually requires massive computation, which may not be affordable for researchers and practitioners. While recent attempts have employed ensemble learning methods to mitigate this enormous computational cost, an essential characteristic of ensemble methods, diversity, is overlooked; as a result, similar sub-architectures tend to be gathered, introducing potential redundancy into the final ensemble architecture. To bridge this gap, we propose a pruning method for NAS ensembles, named Sub-Architecture Ensemble Pruning in Neural Architecture Search (SAEP). It aims to exploit diversity and to obtain sub-ensemble architectures of smaller size with performance comparable to the unpruned ensemble architectures. Three possible solutions are proposed to decide which sub-architectures should be pruned during the search process. Experimental results demonstrate the effectiveness of the proposed method in substantially reducing the size of ensemble architectures while maintaining final performance. Moreover, distinctly deeper architectures can be discovered when the searched sub-architectures are not diverse enough.