Neural architecture search (NAS) has gained increasing attention in recent years due to its flexibility and remarkable capability to reduce the burden of neural network design. To achieve better performance, however, the search process usually incurs massive computational costs that may not be affordable for researchers and practitioners. Although recent attempts have employed ensemble learning methods to mitigate this enormous computational cost, they neglect a key property of ensemble methods, namely diversity, which leads to collecting similar sub-architectures with potential redundancy in the final design. To tackle this problem, we propose a pruning method for NAS ensembles called "Sub-Architecture Ensemble Pruning in Neural Architecture Search (SAEP)." It aims to leverage diversity and to obtain sub-ensemble architectures of smaller size with performance comparable to that of unpruned ensemble architectures. Three possible solutions are proposed to decide which sub-architectures to prune during the search process. Experimental results demonstrate the effectiveness of the proposed method, which largely reduces the number of sub-architectures without degrading performance.