One-shot Network Pruning at Initialization (OPaI) is an effective way to reduce the cost of network pruning. Recently, there has been a growing belief that data is unnecessary in OPaI. However, we reach the opposite conclusion through ablation experiments on two representative OPaI methods, SNIP and GraSP. Specifically, we find that informative data is crucial to enhancing pruning performance. In this paper, we propose two novel methods, Discriminative One-shot Network Pruning (DOP) and Super Stitching, which prune the network using high-level, visually discriminative image patches. Our contributions are as follows. (1) Extensive experiments reveal that OPaI is data-dependent. (2) Super Stitching performs significantly better than the original OPaI methods on the ImageNet benchmark, especially for highly compressed models.
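To make the data-dependence claim concrete, the sketch below shows a minimal SNIP-style one-shot pruning step at initialization: connection-sensitivity scores |w · ∂L/∂w| are computed from a single data batch, and the lowest-scoring weights are zeroed. This is only an illustrative sketch of the generic OPaI baseline, not the paper's DOP or Super Stitching procedure; the function names and the global top-k scheme here are assumptions for exposition. Because the scores are gradients taken on the supplied batch, swapping a random batch for more informative inputs (e.g., discriminative image patches) changes the resulting mask, which is exactly the dependence on data that the ablations probe.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def snip_scores(model: nn.Module, inputs: torch.Tensor, targets: torch.Tensor) -> dict:
    """SNIP-style connection sensitivity |w * dL/dw| per weight,
    computed from one mini-batch at initialization."""
    model.zero_grad()
    loss = F.cross_entropy(model(inputs), targets)
    loss.backward()
    return {
        name: (p * p.grad).abs()
        for name, p in model.named_parameters()
        if p.grad is not None
    }


def one_shot_prune(model: nn.Module, scores: dict, sparsity: float = 0.9) -> dict:
    """Keep the globally top-scoring (1 - sparsity) fraction of weights; zero the rest."""
    flat = torch.cat([s.flatten() for s in scores.values()])
    k = max(1, int((1.0 - sparsity) * flat.numel()))
    threshold = torch.topk(flat, k, largest=True).values.min()
    masks = {name: (s >= threshold).float() for name, s in scores.items()}
    with torch.no_grad():
        for name, p in model.named_parameters():
            if name in masks:
                p.mul_(masks[name])  # apply the one-shot mask in place
    return masks


# Usage (hypothetical data batch): the choice of (inputs, targets) determines the mask.
# model = torchvision.models.resnet18(num_classes=1000)
# masks = one_shot_prune(model, snip_scores(model, inputs, targets), sparsity=0.95)
```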