Differentiable Neural Architecture Search (NAS) requires all layer choices to be held in memory simultaneously, which limits the size of both the search space and the final architecture. In contrast, Probabilistic NAS, such as PARSEC, learns a distribution over high-performing architectures and uses only as much memory as training a single model requires. Nevertheless, it needs to sample many architectures, making it computationally expensive to search in a large space. To address these problems, we propose a sampling method adaptive to the entropy of the architecture distribution: it draws more samples to encourage exploration at the beginning of the search, and fewer as learning proceeds. Furthermore, to search fast in the multi-variate space, we propose a coarse-to-fine strategy that uses a factorized distribution in the early stage, reducing the number of architecture parameters by over an order of magnitude. We call this method Fast Probabilistic NAS (FP-NAS). Compared with PARSEC, it samples 64% fewer architectures and searches 2.1x faster. Compared with FBNetV2, FP-NAS is 1.9x-3.5x faster, and the searched models outperform FBNetV2 models on ImageNet. FP-NAS allows us to expand the giant FBNetV2 space to be wider (i.e., larger channel choices) and deeper (i.e., more blocks), while adding the Split-Attention block and enabling search over the number of splits. When searching a model of size 0.4G FLOPS, FP-NAS is 132x faster than EfficientNet, and the searched FP-NAS-L0 model outperforms EfficientNet-B0 by 0.7% accuracy. Without using any architecture surrogate or scaling tricks, we directly search large models up to 1.0G FLOPS. Our FP-NAS-L2 model with simple distillation outperforms BigNAS-XL, which uses advanced in-place distillation, by 0.7% accuracy at similar FLOPS.
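For intuition, the minimal sketch below illustrates the two ideas: entropy-adaptive sample counts and the parameter savings of a factorized distribution. It assumes a plain categorical distribution, and the constant `scale`, the helper name `adaptive_num_samples`, and the example choice counts are illustrative assumptions, not values or APIs from the paper.

```python
import math

def adaptive_num_samples(probs, scale=4.0, min_samples=1):
    """Entropy-adaptive architecture sample count (illustrative sketch).

    `probs` is a categorical architecture distribution; `scale` is a
    made-up hyper-parameter, not a constant from the paper.
    """
    # Entropy is high early in the search (near-uniform distribution)
    # and shrinks as probability mass concentrates on good architectures.
    entropy = -sum(p * math.log(p) for p in probs if p > 0)
    # Draw more samples when entropy is high (exploration), fewer as the
    # distribution sharpens (exploitation).
    return max(min_samples, round(scale * entropy))

# Early: near-uniform distribution over 8 choices -> many samples.
print(adaptive_num_samples([1 / 8] * 8))               # 8
# Late: mass concentrated on one choice -> few samples.
print(adaptive_num_samples([0.97, 0.01, 0.01, 0.01]))  # 1

# Coarse-to-fine factorization: a layer searched over 3 variables with
# (3, 6, 12) choices needs 3 * 6 * 12 = 216 parameters under a joint
# distribution, but only 3 + 6 + 12 = 21 when factorized per variable.
choices = (3, 6, 12)
print(math.prod(choices), sum(choices))  # 216 21
```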