Differentiable ARchiTecture Search (DARTS) is one of the most popular Neural Architecture Search (NAS) methods. It drastically reduces search cost by resorting to weight-sharing. However, it also dramatically reduces the search space, thus excluding potentially promising architectures. In this article, we propose D-DARTS, a solution that addresses this problem by nesting neural networks at the cell level instead of using weight-sharing to produce more diversified and specialized architectures. Moreover, we introduce a novel algorithm that can derive deeper architectures from a few trained cells, increasing performance and saving computation time. In addition, we present an alternative search space (DARTOpti) in which we optimize existing handcrafted architectures (e.g., ResNet) rather than starting from scratch. This approach is accompanied by a novel metric that measures the distance between architectures inside our custom search space. Our solution reaches competitive performance on multiple computer vision tasks.