Differentiable Architecture Search (DARTS) is now a widely disseminated weight-sharing neural architecture search method. However, two fundamental weaknesses remain untackled. First, we observe that the well-known aggregation of skip connections during optimization is caused by an unfair advantage in an exclusive competition. Second, there is a non-negligible incongruence when discretizing continuous architectural weights into a one-hot representation. For these two reasons, DARTS delivers a biased solution that might not even be suboptimal. In this paper, we present a novel approach that cures both frailties. Specifically, since unfair advantages in a purely exclusive competition easily induce a monopoly, we relax the choice of operations to be collaborative, giving each operation an equal opportunity to develop its strength. We thus call our method Fair DARTS. Moreover, we propose a zero-one loss to directly reduce the discretization gap. Experiments are performed on two mainstream search spaces, in which we achieve new state-of-the-art networks on ImageNet. Our code is available at https://github.com/xiaomi-automl/fairdarts.
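To make the two ideas concrete, below is a minimal PyTorch sketch of a collaborative (sigmoid-gated) mixed operation and a zero-one auxiliary loss of the form L_{0-1} = -(1/N) Σ_i (σ(α_i) - 0.5)². The class and function names, and the loss weight default, are hypothetical illustrations rather than the official implementation; refer to the repository above for the authoritative code.

```python
import torch
import torch.nn as nn

class CollaborativeMixedOp(nn.Module):
    """Hypothetical mixed operation where candidate ops cooperate
    instead of competing exclusively: each op is gated by an
    independent sigmoid rather than one softmax over all ops."""

    def __init__(self, ops):
        super().__init__()
        self.ops = nn.ModuleList(ops)
        # One architecture weight per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(ops)))

    def forward(self, x):
        # Each gate lies in (0, 1) independently of the others,
        # so no op gains an unfair advantage from the others' scores.
        gates = torch.sigmoid(self.alpha)
        return sum(g * op(x) for g, op in zip(gates, self.ops))

def zero_one_loss(alpha, weight=1.0):
    """Auxiliary loss pushing each sigmoid(alpha) toward 0 or 1,
    shrinking the gap introduced by later discretization.
    Implements -(1/N) * sum_i (sigmoid(alpha_i) - 0.5)^2,
    scaled by a weight (the default here is an assumption)."""
    s = torch.sigmoid(alpha)
    return -weight * ((s - 0.5) ** 2).mean()
```

Because this loss is minimized when every gate saturates at 0 or 1, thresholding the gates to obtain the final discrete architecture then discards little information, which is exactly the discretization incongruence the zero-one loss targets.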