This paper focuses on filter-level network pruning. A novel pruning method, termed CLR-RNF, is proposed. We first reveal a "long-tail" pruning problem in magnitude-based weight pruning methods, and then propose a computation-aware measurement of individual weight importance, followed by a Cross-Layer Ranking (CLR) of weights to identify and remove the bottom-ranked weights. The resulting per-layer sparsity then determines the pruned network structure in our filter pruning. Next, we introduce a recommendation-based filter selection scheme in which each filter recommends a group of its closest filters. To pick the preserved filters from these recommended groups, we further devise a k-Reciprocal Nearest Filter (RNF) selection scheme in which the selected filters fall into the intersection of the recommended groups. Both the pruned network structure and the filter selection are obtained by non-learning processes, which significantly reduces the pruning complexity and differentiates our method from existing works. We conduct image classification on CIFAR-10 and ImageNet to demonstrate the superiority of our CLR-RNF over the state-of-the-art methods. For example, on CIFAR-10, CLR-RNF removes 74.1% FLOPs and 95.0% parameters from VGGNet-16 with even a 0.3% accuracy improvement. On ImageNet, it removes 70.2% FLOPs and 64.8% parameters from ResNet-50 with only a 1.7% top-5 accuracy drop. Our project is at https://github.com/lmbxmu/CLR-RNF.
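The following is a minimal sketch of the two non-learning steps the abstract outlines: a cross-layer ranking of weights to derive per-layer sparsity, and a k-reciprocal nearest filter selection. The concrete importance measure (magnitude scaled by per-weight FLOPs) and the reciprocal-voting rule are illustrative assumptions, not the paper's exact formulation; function names such as `per_layer_sparsity` and `k_reciprocal_filters` are hypothetical.

```python
import torch

def per_layer_sparsity(weights, flops, prune_ratio):
    """Cross-Layer Ranking (CLR) sketch: rank all weights jointly
    across layers by an assumed computation-aware importance
    (|w| scaled by the layer's per-weight FLOPs), cut the
    bottom-ranked fraction globally, and read off each layer's
    sparsity.

    weights: list of conv weight tensors, one per layer
    flops:   list of per-layer FLOP counts (same length)
    """
    scores = [(w.abs() * f / w.numel()).flatten()
              for w, f in zip(weights, flops)]
    all_scores = torch.cat(scores)
    k = max(1, int(prune_ratio * all_scores.numel()))
    threshold = all_scores.kthvalue(k).values  # global cutoff
    # Per-layer sparsity = fraction of weights below the cutoff.
    return [float((s < threshold).float().mean()) for s in scores]

def k_reciprocal_filters(filters, k, n_keep):
    """k-Reciprocal Nearest Filter (RNF) sketch: each filter
    "recommends" its k nearest filters; a filter gains a vote when
    the recommendation is mutual, i.e. it lies in the intersection
    of the recommended groups. The voting rule here is assumed.

    filters: (n, d) tensor of flattened filters from one layer
    """
    dist = torch.cdist(filters, filters)                   # pairwise distances
    knn = dist.topk(k + 1, largest=False).indices[:, 1:]   # drop self
    n = filters.size(0)
    votes = torch.zeros(n)
    for i in range(n):
        for j in knn[i]:
            if i in knn[j]:      # reciprocal recommendation
                votes[j] += 1
    return votes.topk(n_keep).indices  # keep most-recommended filters
```

Under these assumptions, the first routine fixes how many filters each layer keeps, and the second picks which ones, so no retraining-based search is needed at either step.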