Neural Architecture Search (NAS) algorithms are intended to remove the burden of manual neural network design, and have been shown capable of designing excellent models for a variety of well-known problems. However, these algorithms require numerous design parameters in the form of user configuration or hard-coded decisions that limit the variety of networks that can be discovered. As a result, NAS algorithms do not eliminate model design tuning; they merely shift where that tuning must be applied. In this paper, we present SpiderNet, a hybrid differentiable-evolutionary and hardware-aware algorithm that rapidly and efficiently produces state-of-the-art networks. More importantly, SpiderNet is a proof-of-concept of a minimally-configured NAS algorithm: the majority of design choices seen in other algorithms are incorporated into SpiderNet's dynamically-evolving search space, reducing the number of user choices to just two, the reduction cell count and the initial channel count. SpiderNet produces models highly competitive with the state of the art, and outperforms random search in accuracy, runtime, memory size, and parameter count.