The goal of filter pruning is to identify and remove unimportant filters so that convolutional neural networks (CNNs) become more efficient without sacrificing performance. The challenge lies in finding information that indicates how important or relevant each filter is with respect to the final output of the network. In this work, we share our observation that the batch normalization (BN) parameters of pre-trained CNNs can be used to estimate the feature distribution of activation outputs without processing any training data. Based on this observation, we propose a simple yet effective filter pruning method that evaluates the importance of each filter from the BN parameters of the pre-trained CNN. Experimental results on CIFAR-10 and ImageNet demonstrate that the proposed method achieves outstanding performance, with and without fine-tuning, in terms of the trade-off between the accuracy drop and the reduction in computational complexity and number of parameters of the pruned networks.
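To make the idea concrete, here is a minimal PyTorch sketch of a data-free, BN-based filter score. The abstract does not spell out the exact scoring rule, so this is one plausible instantiation rather than the paper's definitive criterion: since a BN layer with per-channel scale γ and shift β produces approximately N(β, γ²)-distributed outputs, the expected post-ReLU activation E[max(0, X)] with X ~ N(β, γ²) can serve as a data-free importance estimate. The function name `bn_filter_scores` and the pruning ratio below are illustrative assumptions.

```python
# Hedged sketch: score filters of a Conv-BN-ReLU block using only the BN
# parameters of a pre-trained model (no training data required). Assumes the
# importance of a channel is the expected post-ReLU activation implied by
# its BN scale gamma and shift beta; this is an illustrative choice, not
# necessarily the paper's exact criterion.
import torch
import torch.nn as nn


def bn_filter_scores(bn: nn.BatchNorm2d) -> torch.Tensor:
    """Per-channel importance from BN parameters alone.

    A BN output channel is approximately N(beta, gamma^2) distributed, so
    E[ReLU(X)] = beta * Phi(beta/|gamma|) + |gamma| * phi(beta/|gamma|),
    where Phi and phi are the standard normal CDF and PDF.
    """
    gamma = bn.weight.detach()
    beta = bn.bias.detach()
    sigma = gamma.abs().clamp_min(1e-12)  # per-channel std of the BN output
    z = beta / sigma
    std_normal = torch.distributions.Normal(0.0, 1.0)
    phi = torch.exp(std_normal.log_prob(z))  # standard normal PDF at z
    Phi = std_normal.cdf(z)                  # standard normal CDF at z
    return beta * Phi + sigma * phi


# Usage: rank the filters of a pre-trained block and mark the lowest-scoring
# fraction for removal (prune_ratio is a hypothetical setting).
model = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.BatchNorm2d(16), nn.ReLU())
scores = bn_filter_scores(model[1])
prune_ratio = 0.5
k = int(len(scores) * prune_ratio)
prune_idx = scores.argsort()[:k]  # indices of filters judged least important
```

After the low-scoring filters are removed, the pruned network can be used as-is or fine-tuned, matching the two evaluation settings reported in the abstract.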