Recent breakthroughs in hand-crafted neural architectures for visual recognition have highlighted the urgent need to explore hybrid architectures consisting of diversified building blocks. Meanwhile, neural architecture search (NAS) methods are gaining momentum, with the expectation of reducing human effort. However, whether NAS methods can efficiently and effectively handle diversified search spaces with disparate candidates (e.g., CNNs and transformers) is still an open question. In this work, we present Block-wisely Self-supervised Neural Architecture Search (BossNAS), an unsupervised NAS method that addresses the problem of inaccurate architecture rating caused by the large weight-sharing space and biased supervision in previous methods. More specifically, we factorize the search space into blocks and utilize a novel self-supervised training scheme, named ensemble bootstrapping, to train each block separately before searching them as a whole towards the population center. Additionally, we present the HyTra search space, a fabric-like hybrid CNN-transformer search space with searchable down-sampling positions. On this challenging search space, our searched model, BossNet-T, achieves up to 82.5% accuracy on ImageNet, surpassing EfficientNet by 2.4% with comparable compute time. Moreover, our method achieves superior architecture rating accuracy, with 0.78 and 0.76 Spearman correlation on the canonical MBConv search space with ImageNet and on the NATS-Bench size search space with CIFAR-100, respectively, surpassing state-of-the-art NAS methods. Code: https://github.com/changlin31/BossNAS
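Below is a minimal, hypothetical sketch of the ensemble bootstrapping idea for a single search block, assuming a BYOL-style setup: randomly sampled candidate paths (students) in a weight-sharing block regress a target produced by a slowly updated (EMA) copy of the block, whose output is the ensemble (average) of several sampled paths, i.e., the "population center". All module and function names here are illustrative, not the official BossNAS API.

```python
# Hypothetical sketch of ensemble bootstrapping for one search block.
# Not the BossNAS implementation; names and hyperparameters are illustrative.
import copy
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyBlock(nn.Module):
    """A toy weight-sharing block with two candidate operations per layer."""
    def __init__(self, dim=32, layers=2):
        super().__init__()
        self.choices = nn.ModuleList(
            nn.ModuleList([nn.Linear(dim, dim), nn.Linear(dim, dim)])
            for _ in range(layers)
        )

    def forward(self, x, path):
        for layer, op_idx in zip(self.choices, path):
            x = F.relu(layer[op_idx](x))
        return x

def sample_path(block):
    # Randomly pick one candidate operation per layer.
    return [random.randrange(len(layer)) for layer in block.choices]

online = ToyBlock()
target = copy.deepcopy(online)          # EMA ("bootstrapped") copy of the block
for p in target.parameters():
    p.requires_grad_(False)
opt = torch.optim.SGD(online.parameters(), lr=0.1)

x = torch.randn(8, 32)                  # stand-in for features from the previous block

# Target: ensemble (average) of several sampled paths through the EMA block.
with torch.no_grad():
    ensemble_target = torch.stack(
        [target(x, sample_path(target)) for _ in range(4)]
    ).mean(0)
    ensemble_target = F.normalize(ensemble_target, dim=-1)

# Each sampled student path regresses the shared ensemble target
# (BYOL-style normalized regression loss).
loss = 0.0
for _ in range(4):
    student_out = F.normalize(online(x, sample_path(online)), dim=-1)
    loss = loss + (2 - 2 * (student_out * ensemble_target).sum(-1).mean())
loss.backward()
opt.step()

# EMA update of the target block.
with torch.no_grad():
    for p_t, p_o in zip(target.parameters(), online.parameters()):
        p_t.mul_(0.99).add_(p_o, alpha=0.01)
```

Training blocks separately in this manner keeps the weight-sharing space small per block, which is what the abstract credits for the improved architecture rating accuracy.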