Existing neural architecture search algorithms mostly operate on search spaces with short-distance connections. We argue that such designs, though safe and stable, obstruct the search algorithms from exploring more complicated scenarios. In this paper, we build the search algorithm upon a complicated search space with long-distance connections, and show that existing weight-sharing search algorithms mostly fail due to the existence of \textbf{interleaved connections}. Based on this observation, we present a simple yet effective algorithm named \textbf{IF-NAS}, in which we perform a periodic sampling strategy to construct different sub-networks during the search procedure, preventing interleaved connections from emerging in any of them. In the proposed search space, IF-NAS outperforms both random sampling and previous weight-sharing search algorithms by a significant margin. IF-NAS also generalizes to the micro, cell-based search spaces, which are much easier. Our research emphasizes the importance of the macro structure, and we look forward to further efforts along this direction.