Channel (or 3D filter) pruning is an effective way to accelerate the inference of neural networks. A flurry of algorithms has been proposed to solve this practical problem, each claimed to be effective in some way. Yet a benchmark for comparing these algorithms directly is lacking, mainly because of the complexity of the algorithms and custom settings such as the particular network configuration or training procedure. A fair benchmark is important for the further development of channel pruning. Meanwhile, recent investigations reveal that the channel configurations discovered by pruning algorithms are at least as important as the pre-trained weights. This gives channel pruning a new role, namely searching for the optimal channel configuration. In this paper, we try to determine the channel configuration of the pruned models by random search. The proposed approach provides a new way to compare different methods, namely how well they behave relative to random pruning. We show that this simple strategy works quite well compared with other channel pruning methods. We also show that, under this setting, there are surprisingly no clear winners among the different channel importance evaluation methods, which may shift research efforts toward advanced channel configuration search methods.
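To make the idea of searching channel configurations at random concrete, the sketch below samples per-layer channel counts under a parameter budget. It is a minimal illustration, not the paper's implementation: the cost model (a plain chain of 3x3 convolutions) and the helper names `count_params` and `random_channel_config` are assumptions introduced here for exposition.

```python
import random

def count_params(channels, in_channels=3, kernel_size=3):
    # Hypothetical cost model: a plain chain of 3x3 conv layers, where
    # layer i maps channels[i-1] (or in_channels for the first layer)
    # to channels[i] output channels.
    widths = [in_channels] + list(channels)
    return sum(c_in * c_out * kernel_size * kernel_size
               for c_in, c_out in zip(widths[:-1], widths[1:]))

def random_channel_config(base_channels, budget_ratio=0.5, max_trials=1000):
    """Randomly sample per-layer channel counts until the pruned model
    fits within `budget_ratio` of the unpruned parameter count."""
    budget = budget_ratio * count_params(base_channels)
    for _ in range(max_trials):
        candidate = [random.randint(1, c) for c in base_channels]
        if count_params(candidate) <= budget:
            return candidate
    raise RuntimeError("no configuration found within the budget")

# Example: a small VGG-like stack of five conv layers pruned to ~50% parameters.
base = [64, 128, 256, 256, 512]
print(random_channel_config(base, budget_ratio=0.5))
```

In an actual study, each sampled configuration would be trained (or fine-tuned from inherited weights) and evaluated, so that pruning methods can be judged against this random baseline.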