Works on the lottery ticket hypothesis (LTH) and single-shot network pruning (SNIP) have recently drawn much attention to post-training pruning (iterative magnitude pruning) and before-training pruning (pruning at initialization). The former suffers from an extremely large computational cost, while the latter usually struggles with insufficient performance. In comparison, during-training pruning, a class of pruning methods that simultaneously enjoys training/inference efficiency and comparable performance, has so far been less explored. To better understand during-training pruning, we quantitatively study the effect of pruning throughout training from the perspective of pruning plasticity (the ability of pruned networks to recover their original performance). Pruning plasticity helps explain several other empirical observations about neural network pruning in the literature. We further find that pruning plasticity can be substantially improved by injecting a brain-inspired mechanism called neuroregeneration, i.e., regenerating the same number of connections as were pruned. We design a novel gradual magnitude pruning (GMP) method, named gradual pruning with zero-cost neuroregeneration (\textbf{GraNet}), that advances the state of the art. Perhaps most impressively, its sparse-to-sparse version for the first time boosts sparse-to-sparse training performance over various dense-to-sparse methods with ResNet-50 on ImageNet, without extending the training time. We release all code at https://github.com/Shiweiliuiiiiiii/GraNet.
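To make the two ingredients named above concrete, the cubic sparsity schedule of gradual magnitude pruning and a magnitude-drop/gradient-regrow neuroregeneration step can be sketched as below. This is a minimal NumPy sketch on flattened weight vectors; the function names `gmp_sparsity` and `neuroregenerate`, and the choice of gradient magnitude as the regrowth criterion, are illustrative assumptions rather than the paper's exact implementation.

```python
import numpy as np

def gmp_sparsity(t, t0, n_steps, dt, s_init=0.0, s_final=0.9):
    """Cubic sparsity schedule used by gradual magnitude pruning
    (Zhu & Gupta, 2017): sparsity rises from s_init to s_final over
    n_steps pruning steps spaced dt training iterations apart."""
    t = min(max(t, t0), t0 + n_steps * dt)
    frac = (t - t0) / (n_steps * dt)
    return s_final + (s_init - s_final) * (1.0 - frac) ** 3

def neuroregenerate(weights, grads, mask, k):
    """Zero-cost neuroregeneration, sketched: drop the k active weights
    with the smallest magnitude, then reactivate the k inactive
    positions with the largest gradient magnitude. The number of
    connections (hence the sparsity level) is unchanged."""
    weights, grads = weights.ravel(), grads.ravel()
    mask = mask.ravel().copy()
    active = np.flatnonzero(mask)
    inactive = np.flatnonzero(~mask)
    # Prune: k smallest-magnitude connections among the active ones.
    drop = active[np.argsort(np.abs(weights[active]))[:k]]
    # Regenerate: k inactive positions with the largest gradient magnitude.
    grow = inactive[np.argsort(-np.abs(grads[inactive]))[:k]]
    mask[drop] = False
    mask[grow] = True
    return mask
```

Because regeneration reuses gradients already computed during backpropagation and adds no extra forward/backward passes, it is "zero-cost" in the sense used above: the overall training FLOPs budget is unchanged.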