Channel pruning has demonstrated its effectiveness in compressing ConvNets. In many prior works, the importance of an output feature map is determined solely by its associated filter. However, these methods ignore the small portion of weights in the next layer that disappears when the feature map is removed; that is, they ignore weight dependency. In addition, many pruning methods rely on a single evaluation criterion and search for a sweet spot between pruned structure and accuracy in a trial-and-error fashion, which can be time-consuming. To address these issues, we propose CPMC, a channel pruning algorithm via multi-criteria based on weight dependency, which can compress a variety of models efficiently. We define the importance of a feature map in three aspects: its associated weight value, computational cost, and parameter quantity. Exploiting weight dependency, we obtain the importance of a feature map by assessing its associated filter together with the corresponding partial weights of the next layer, and then apply global normalization to enable cross-layer comparison. Our method can compress various CNN models, including VGGNet, ResNet, and DenseNet, on several image classification datasets. Extensive experiments show that CPMC significantly outperforms prior methods.
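To make the weight-dependency idea concrete, below is a minimal sketch in PyTorch, not the paper's actual implementation. It covers only the weight-value criterion (the full method also incorporates computational cost and parameter quantity): each output channel's score combines the norm of its filter in the current layer with the norm of the next layer's weights that read from that channel, followed by a simple per-layer normalization for cross-layer comparison. The function names and the sum-based normalization are illustrative assumptions.

```python
import torch

def channel_importance(w_curr, w_next):
    """Hypothetical sketch of weight dependency: the score of output
    channel k combines the filter that produces feature map k with the
    next layer's weights that would be removed along with it.

    w_curr: (C_out, C_in, kH, kW) weights of the current conv layer
    w_next: (C_next, C_out, kH, kW) weights of the next conv layer
    """
    # Importance contributed by the filter producing the feature map.
    filter_score = w_curr.abs().sum(dim=(1, 2, 3))      # shape (C_out,)
    # Importance contributed by the next layer's weights that depend on
    # this feature map and disappear when it is pruned.
    dependent_score = w_next.abs().sum(dim=(0, 2, 3))   # shape (C_out,)
    return filter_score + dependent_score

def globally_normalized_scores(layer_pairs):
    """Normalize per-layer scores so channels from different layers are
    comparable, then concatenate them for one global ranking."""
    scores = []
    for w_curr, w_next in layer_pairs:
        s = channel_importance(w_curr, w_next)
        scores.append(s / s.sum())  # one simple normalization choice
    return torch.cat(scores)
```

Under this sketch, the channels with the smallest globally normalized scores would be pruned first, which is what makes a single global threshold possible instead of per-layer trial and error.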