Differentiable ARchiTecture Search (DARTS) is one of the most popular Neural Architecture Search (NAS) methods. It drastically reduces search cost by resorting to weight-sharing. However, it also dramatically reduces the search space, thus excluding potentially promising architectures. In this article, we propose D-DARTS, a solution that addresses this problem by nesting neural networks at the cell level instead of using weight-sharing, producing more diversified and specialized architectures. Moreover, we introduce a novel algorithm that can derive deeper architectures from a few trained cells, increasing performance and saving computation time. In addition, we present an alternative search space (DARTOpti) in which we optimize existing handcrafted architectures (e.g., ResNet) rather than starting from scratch. This approach is accompanied by a novel metric that measures the distance between architectures inside our custom search space. Our solution reaches competitive performance on multiple computer vision tasks. Code and pretrained models can be accessed at https://github.com/aheuillet/D-DARTS.