Neural Architecture Search (NAS) has shifted network design from relying on human intuition to leveraging search algorithms guided by evaluation metrics. We study channel size optimization in convolutional neural networks (CNNs) and identify the role it plays in model accuracy and complexity. Current channel size selection methods are generally limited to discrete sample spaces and suffer from manual iteration and simple heuristics. To address this, we introduce an efficient dynamic scaling algorithm -- CONet -- that automatically optimizes channel sizes across the layers of a given CNN. Two metrics -- ``\textit{Rank}'' and ``\textit{Rank Average Slope}'' -- are introduced to quantify the information accumulated during training. The algorithm dynamically scales channel sizes up or down over a fixed searching phase. We conduct experiments on the CIFAR10/100 and ImageNet datasets and show that CONet finds efficient and accurate architectures, searched in the ResNet, DARTS, and DARTS+ spaces, that outperform their baseline models.
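The paper defines \textit{Rank} and \textit{Rank Average Slope} precisely in the body via low-rank analysis of the layer weights; the sketch below is only a minimal illustration under our own assumptions. An SVD energy threshold stands in as a proxy for the Rank metric, the slope of that proxy across epochs stands in for the Rank Average Slope, and the names (`effective_rank`, `channel_scale`) and thresholds are hypothetical, not the paper's actual formulation.

```python
import numpy as np

def effective_rank(weight: np.ndarray, energy: float = 0.99) -> float:
    """Effective-rank proxy for a conv layer.

    Unfolds the 4D kernel (out_ch, in_ch, kH, kW) into a 2D matrix and
    returns the fraction of singular values needed to retain `energy`
    of the total spectral energy, normalized to [0, 1].
    """
    mat = weight.reshape(weight.shape[0], -1)
    s = np.linalg.svd(mat, compute_uv=False)
    cum = np.cumsum(s ** 2) / np.sum(s ** 2)
    k = min(int(np.searchsorted(cum, energy)) + 1, s.size)
    return k / min(mat.shape)

def channel_scale(rank_history: list,
                  grow: float = 1.25, shrink: float = 0.8,
                  slope_thresh: float = 0.01) -> float:
    """Illustrative scaling rule: widen a layer while its rank metric is
    still climbing across epochs (positive average slope), narrow it
    once the slope flattens out."""
    if len(rank_history) < 2:
        return 1.0  # not enough history to estimate a slope
    slope = float(np.mean(np.diff(rank_history)))  # rank-average-slope proxy
    return grow if slope > slope_thresh else shrink

# Toy usage: a layer whose rank proxy keeps rising during the searching
# phase receives a growth factor for its channel count.
history = [0.30, 0.34, 0.39, 0.45]  # effective ranks over four epochs
print(channel_scale(history))       # -> 1.25, the layer is still gaining rank
```

In this reading, a rising rank signals that the layer is still accumulating information and can absorb more channels, while a flat or falling rank suggests over-parameterization; the grow/shrink factors here are arbitrary placeholders.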