Network pruning is a widely used technique for reducing the computation cost and model size of deep neural networks. However, the typical three-stage pipeline of training, pruning, and retraining (fine-tuning) significantly increases the overall training time. In this paper, we develop a systematic weight-pruning optimization approach based on Surrogate Lagrangian Relaxation (SLR), which is tailored to overcome the difficulties caused by the discrete nature of the weight-pruning problem while ensuring fast convergence. We further accelerate the convergence of SLR by using quadratic penalties. Model parameters obtained by SLR during the training phase are much closer to their optimal values than those obtained by other state-of-the-art methods. We evaluate the proposed method on image classification tasks (ResNet-18 and ResNet-50 on ImageNet; ResNet-18, ResNet-50, and VGG-16 on CIFAR-10) as well as object detection tasks (YOLOv3 and YOLOv3-tiny on COCO 2014, and Ultra-Fast-Lane-Detection on the TuSimple lane detection dataset). Experimental results demonstrate that our SLR-based weight-pruning optimization approach achieves a higher compression rate than state-of-the-art methods under the same accuracy requirement. It also maintains high model accuracy even at the hard-pruning stage without retraining, reducing the traditional three-stage pruning pipeline to two stages. Given a limited budget of retraining epochs, our approach quickly recovers the model accuracy.
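To make the optimization structure concrete, below is a minimal, self-contained sketch of an SLR-style pruning loop on a toy least-squares objective. It assumes the common augmented-Lagrangian-type decomposition into trainable weights, a hard-pruned auxiliary copy constrained to the sparsity set, and Lagrangian multipliers coupled through a quadratic penalty. The names (`project_sparse`, `slr_style_prune`) and the fixed multiplier step are illustrative placeholders; they do not reproduce the paper's actual SLR step-size rules or penalty schedule.

```python
import numpy as np

def project_sparse(w, k):
    """Keep the k largest-magnitude entries of w and zero out the rest
    (hard projection onto the sparsity constraint ||z||_0 <= k)."""
    z = np.zeros_like(w)
    idx = np.argsort(np.abs(w))[-k:]
    z[idx] = w[idx]
    return z

def slr_style_prune(grad_f, w0, k, rho=1.0, lr=0.1, outer_iters=50, inner_iters=20):
    """Illustrative alternation: (1) gradient steps on the loss plus a quadratic
    penalty tying the weights w to a sparse auxiliary copy z, (2) hard projection
    to update z, (3) a multiplier update on the constraint w = z. The simple fixed
    multiplier step stands in for the SLR step-size rule (hypothetical simplification)."""
    w = w0.copy()
    z = project_sparse(w, k)
    lam = np.zeros_like(w)                      # Lagrangian multipliers
    for _ in range(outer_iters):
        # (1) minimize f(w) + (rho/2) * ||w - z + lam/rho||^2 approximately
        for _ in range(inner_iters):
            g = grad_f(w) + rho * (w - z + lam / rho)
            w -= lr * g
        # (2) project onto the sparse set (hard pruning of the auxiliary copy)
        z = project_sparse(w + lam / rho, k)
        # (3) multiplier update
        lam += rho * (w - z)
    return project_sparse(w, k)                 # final hard-pruned weights

# Toy usage: prune a 20-dimensional least-squares model to 5 nonzero weights.
rng = np.random.default_rng(0)
A, b = rng.normal(size=(100, 20)), rng.normal(size=100)
grad_f = lambda w: A.T @ (A @ w - b) / len(b)
w_pruned = slr_style_prune(grad_f, np.zeros(20), k=5)
print("nonzeros:", np.count_nonzero(w_pruned))
```

Because the auxiliary variable z is already feasible (exactly k nonzeros) at every outer iteration, the weights w are pulled toward a sparse solution during training itself, which is the intuition behind retaining accuracy at the hard-pruning stage without a separate retraining pass.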