Differentiable neural architecture search (DARTS) has achieved notable success in discovering flexible and diverse cell types. To reduce the evaluation gap, the supernet is expected to have the same number of layers as the target network. However, even with this consistent search, the searched cells often perform poorly, especially when the supernet has fewer layers, because current DARTS methods are prone to wide and shallow cells; this topology collapse yields sub-optimal searched cells. In this paper, we alleviate this issue by endowing the cells with explicit stretchability, so that the search can be performed directly on our stretchable cells over both operation type and topology simultaneously. Concretely, we introduce a set of topological variables and a combinatorial probabilistic distribution to explicitly model the target topology. With more diverse and complex topologies, our method adapts well to various numbers of layers. Extensive experiments on CIFAR-10 and ImageNet show that our stretchable cells obtain better performance with fewer layers and parameters. For example, our method improves DARTS by 0.28\% accuracy on the CIFAR-10 dataset with 45\% fewer parameters, and by 2.9\% on the ImageNet dataset with similar FLOPs.
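To make the notion of topological variables concrete, below is a minimal, hypothetical PyTorch sketch (not the paper's actual parameterization): each intermediate node of a DARTS-style cell retains exactly two of its candidate input edges, and one learnable logit per edge subset defines a categorical distribution over the possible topologies of that node. The class name TopologyDistribution, the parameter beta, and the edge_weights helper are illustrative assumptions.

```python
import itertools
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopologyDistribution(nn.Module):
    """Hypothetical sketch of a combinatorial topology distribution for one
    intermediate node in a DARTS-style cell: the node has `num_inputs`
    candidate incoming edges and keeps exactly `keep` of them, with a
    learnable logit assigned to every possible edge subset."""

    def __init__(self, num_inputs: int, keep: int = 2):
        super().__init__()
        self.num_inputs = num_inputs
        # Enumerate all C(num_inputs, keep) edge subsets.
        self.subsets = list(itertools.combinations(range(num_inputs), keep))
        # One topological variable (logit) per subset.
        self.beta = nn.Parameter(1e-3 * torch.randn(len(self.subsets)))

    def edge_weights(self) -> torch.Tensor:
        """Marginal probability that each incoming edge is retained,
        obtained by summing the probabilities of subsets containing it."""
        probs = F.softmax(self.beta, dim=0)  # categorical P(subset)
        marginals = []
        for e in range(self.num_inputs):
            marginals.append(sum(p for p, s in zip(probs, self.subsets) if e in s))
        return torch.stack(marginals)


# Usage: a node with 4 candidate input edges, keeping 2 of them.
topo = TopologyDistribution(num_inputs=4, keep=2)
print(topo.edge_weights())  # soft edge-retention weights for the 4 edges
```

In a full search, one could use these marginal edge weights to scale the edge outputs alongside the usual operation-mixing weights, so that the expected topology is optimized jointly with the operation types.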