Simplicity is the ultimate sophistication. Differentiable Architecture Search (DARTS) has become one of the mainstream paradigms of neural architecture search. However, it largely suffers from the well-known performance collapse issue caused by an aggregation of skip connections: the search is thought to overly benefit from the residual structure, which accelerates information flow. To weaken this effect, we propose to inject unbiased random noise into skip connections to impede the flow. We name this novel approach NoisyDARTS. In effect, the network optimizer perceives this difficulty at each training step and refrains from overshooting, especially on skip connections. In the long run, since the injected noise is zero-mean and thus adds no bias to the gradient in expectation, the search is still likely to converge to the right solution region. We also prove that the injected noise smooths the loss landscape, which makes optimization easier. Our method features extreme simplicity and acts as a new strong baseline. We perform extensive experiments across various search spaces, datasets, and tasks, where we robustly achieve state-of-the-art results. Our code is available at https://github.com/xiaomi-automl/NoisyDARTS.
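The core operation, injecting zero-mean noise into the skip-connection output during search, can be sketched in a few lines of PyTorch. This is a minimal illustration under stated assumptions, not the released implementation: the class name NoisySkip, the fixed standard deviation std, and the choice of additive Gaussian noise are illustrative; consult the repository above for the authors' exact formulation.

```python
import torch
import torch.nn as nn

class NoisySkip(nn.Module):
    """Skip connection with unbiased additive noise during search.

    The noise is zero-mean, so the gradient through this path is
    unbiased in expectation, yet each individual step is perturbed,
    discouraging the optimizer from over-favoring skip connections.
    """

    def __init__(self, std: float = 0.1):
        super().__init__()
        self.std = std  # noise scale (hypothetical default for this sketch)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Additive zero-mean Gaussian noise on the identity path.
            return x + torch.randn_like(x) * self.std
        # At evaluation time the operation reduces to a plain skip.
        return x
```

In a DARTS-style search, such a module would replace the plain identity operation in the candidate operation set; because the perturbation vanishes in expectation, the architecture weights can still converge toward the correct solution region while the skip path no longer enjoys an unimpeded shortcut.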