Soft threshold pruning is among the cutting-edge pruning methods with state-of-the-art performance. However, previous methods either perform aimless searching on the threshold scheduler or simply set the threshold trainable, lacking a theoretical explanation from a unified perspective. In this work, we reformulate soft threshold pruning as an implicit optimization problem solved using the Iterative Shrinkage-Thresholding Algorithm (ISTA), a classic method from the fields of sparse recovery and compressed sensing. Under this theoretical framework, all threshold tuning strategies proposed in previous studies of soft threshold pruning are shown to be different styles of tuning the $L_1$-regularization term. We further derive an optimal threshold scheduler through an in-depth study of threshold scheduling based on our framework. This scheduler keeps the $L_1$-regularization coefficient stable, implying a time-invariant objective function from the perspective of optimization. In principle, the derived pruning algorithm can sparsify any mathematical model trained via SGD. We conduct extensive experiments and verify its state-of-the-art performance on both Artificial Neural Networks (ResNet-50 and MobileNet-V1) and Spiking Neural Networks (SEW ResNet-18) on the ImageNet dataset. On the basis of this framework, we derive a family of pruning methods, including sparsify-during-training, early pruning, and pruning at initialization. The code is available at https://github.com/Yanqi-Chen/LATS.
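As a minimal sketch of the connection described above (assuming NumPy; the function names and the plain-SGD update are illustrative, not the paper's implementation): ISTA alternates a gradient step on the smooth loss with the soft-thresholding operator, the proximal map of the $L_1$ norm. The effective per-step threshold is the learning rate times the $L_1$ coefficient, so holding that coefficient fixed corresponds to a time-invariant regularized objective.

```python
import numpy as np

def soft_threshold(x, thresh):
    """Soft-thresholding (shrinkage) operator: the proximal map of the L1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - thresh, 0.0)

def ista_step(w, grad, lr, l1_coef):
    """One ISTA update on an L1-regularized objective: a gradient step on the
    smooth loss, followed by soft thresholding with threshold lr * l1_coef.
    Entries shrunk to exactly zero are the pruned weights."""
    return soft_threshold(w - lr * grad, lr * l1_coef)
```

For example, `soft_threshold(np.array([2.0, -0.5, 0.3]), 0.5)` shrinks the large entry to `1.5` and zeroes the two entries whose magnitude is at or below the threshold, which is how the update induces sparsity.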