Differentiable architecture search is prevalent in the field of NAS because of its simplicity and efficiency, and two paradigms dominate: multi-path algorithms and single-path methods. Multi-path frameworks (e.g., DARTS) are intuitive but suffer from high memory usage and training collapse. Single-path methods (e.g., GDAS and ProxylessNAS) mitigate the memory issue and shrink the gap between searching and evaluation, but sacrifice performance. In this paper, we propose a conceptually simple yet efficient method to bridge these two paradigms, referred to as Mutually-aware Sub-Graphs Differentiable Architecture Search (MSG-DAS). The core of our framework is a differentiable Gumbel-TopK sampler that produces multiple mutually exclusive single-path sub-graphs. To alleviate the aggravated skip-connect issue brought by the multiple sub-graph setting, we propose a Dropblock-Identity module to stabilize the optimization. To make the best use of the available models (super-net and sub-graphs), we introduce a memory-efficient super-net guidance distillation to improve training. The proposed framework strikes a balance between flexible memory usage and searching quality. We demonstrate the effectiveness of our method on ImageNet and CIFAR10, where the searched models show performance comparable to the most recent approaches.
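To make the sampler concrete, below is a minimal sketch of how a Gumbel-TopK sampler could draw k mutually exclusive single-path masks per edge with straight-through gradients. The function name, tensor shapes, and the particular straight-through relaxation are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def gumbel_topk_sample(logits: torch.Tensor, k: int, tau: float = 1.0):
    """Sketch of a Gumbel-Top-k sampler (assumed interface, not the official code).

    logits: [num_edges, num_ops] architecture parameters.
    Returns k one-hot masks of shape [num_edges, num_ops], one per sub-graph,
    selecting distinct (mutually exclusive) operations on every edge.
    """
    # Gumbel(0, 1) noise: -log(-log(U)) with U ~ Uniform(0, 1).
    gumbel = -torch.empty_like(logits).exponential_().log()
    perturbed = (logits + gumbel) / tau

    # Top-k of the perturbed scores yields k distinct ops per edge,
    # so the resulting single-path sub-graphs never share an operation.
    topk_idx = perturbed.topk(k, dim=-1).indices          # [num_edges, k]

    soft = F.softmax(perturbed, dim=-1)                    # soft relaxation used for gradients
    masks = []
    for j in range(k):
        hard = torch.zeros_like(logits).scatter_(-1, topk_idx[:, j:j + 1], 1.0)
        # Straight-through estimator: forward pass uses the hard one-hot,
        # backward pass flows through the soft relaxation to the logits.
        masks.append(hard + soft - soft.detach())
    return masks
```

In this sketch, each mask gates one operation per edge, so forwarding a batch through one mask at a time keeps memory close to a single-path method while still exposing several sub-graphs for joint optimization.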