Neural network pruning has shown remarkable effectiveness in reducing the complexity of deep network models. Recent network pruning methods typically focus on removing unimportant or redundant filters from the network. In this paper, by exploring the similarities between feature maps, we propose a novel filter pruning method, Central Filter (CF), based on the observation that a filter is approximately equal to a set of other filters after appropriate adjustments. Our method builds on the discovery that the average similarity between feature maps changes very little, regardless of the number of input images. Based on this finding, we establish similarity graphs on feature maps and calculate the closeness centrality of each node to select the Central Filter. Moreover, we design a method to directly adjust the weights in the next layer corresponding to the Central Filter, effectively minimizing the error caused by pruning. Through experiments on various benchmark networks and datasets, CF yields state-of-the-art performance. For example, with ResNet-56, CF reduces approximately 39.7% of FLOPs by removing 47.1% of the parameters, with even a 0.33% accuracy improvement on CIFAR-10. With GoogLeNet, CF reduces approximately 63.2% of FLOPs by removing 55.6% of the parameters, with only a small loss of 0.35% in top-1 accuracy on CIFAR-10. With ResNet-50, CF reduces approximately 47.9% of FLOPs by removing 36.9% of the parameters, with only a small loss of 1.07% in top-1 accuracy on ImageNet. The code is available at https://github.com/8ubpshLR23/Central-Filter.
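The graph-and-centrality selection step described above can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's exact procedure: the cosine-similarity measure, the edge threshold `sim_threshold`, and the number of filters kept are all illustrative assumptions, and the function names are invented for this sketch.

```python
import numpy as np
from collections import deque

def cosine_sim(a, b):
    # Cosine similarity between two flattened feature maps (assumed measure).
    a, b = a.ravel(), b.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def closeness_centrality(adj):
    # Unweighted closeness centrality via BFS shortest paths.
    # adj: dict mapping node -> set of neighbor nodes.
    cc = {}
    for s in adj:
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total = sum(dist.values())
        cc[s] = (len(dist) - 1) / total if total > 0 else 0.0
    return cc

def select_central_filters(feature_maps, sim_threshold=0.7, keep=2):
    # Build a similarity graph over filters: connect two filters when their
    # feature maps are similar enough, then rank nodes by closeness centrality.
    n = len(feature_maps)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if cosine_sim(feature_maps[i], feature_maps[j]) >= sim_threshold:
                adj[i].add(j)
                adj[j].add(i)
    cc = closeness_centrality(adj)
    # Filters with the highest centrality act as "Central Filters".
    return sorted(cc, key=cc.get, reverse=True)[:keep]
```

For example, three filters whose feature maps are scalar multiples of one another form a tightly connected cluster with high closeness centrality, while an orthogonal feature map stays isolated and ranks last.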