Second-order information, in the form of Hessian- or Inverse-Hessian-vector products, is a fundamental tool for solving optimization problems. Recently, there has been significant interest in utilizing this information in the context of deep neural networks; however, relatively little is known about the quality of existing approximations in this setting. Our work examines this question, identifies issues with existing approaches, and proposes a method called WoodFisher to compute a faithful and efficient estimate of the inverse Hessian. Our main application is to neural network compression, where we build on the classic Optimal Brain Damage/Surgeon framework. We demonstrate that WoodFisher significantly outperforms popular state-of-the-art methods for one-shot pruning. Further, even when iterative, gradual pruning is considered, our method yields gains in test accuracy over state-of-the-art approaches when pruning popular neural networks (such as ResNet-50 and MobileNetV1) trained on standard image classification datasets such as ImageNet ILSVRC. We examine how our method can be extended to take into account first-order information, and illustrate its ability to automatically set layer-wise pruning thresholds and perform compression in the limited-data regime. The code is available at https://github.com/IST-DASLab/WoodFisher.
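The abstract does not spell out the estimator itself, but to make the idea concrete, below is a minimal sketch of the kind of computation it refers to: approximating the Hessian by the empirical Fisher (an average of per-sample gradient outer products) and forming its inverse with rank-one Sherman-Morrison/Woodbury updates, then using the classic Optimal Brain Damage/Surgeon saliency to rank weights for pruning. All function names, the dampening value `lam`, and the dense-matrix formulation are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def woodfisher_inverse(grads, lam=1e-3):
    """Estimate (lam*I + (1/N) * sum_n g_n g_n^T)^{-1} via rank-one updates.

    grads: array of shape (N, d), one per-sample gradient per row.
    """
    N, d = grads.shape
    F_inv = np.eye(d) / lam                       # damped identity as the starting point
    for g in grads:
        Fg = F_inv @ g                            # O(d^2) work per rank-one update
        F_inv -= np.outer(Fg, Fg) / (N + g @ Fg)  # Sherman-Morrison/Woodbury step
    return F_inv

def obs_pruning_scores(w, F_inv):
    """OBD/OBS-style saliency: estimated loss increase from zeroing each weight."""
    return w ** 2 / (2.0 * np.diag(F_inv))

# Toy usage: score the weights of a small model given per-sample gradients.
rng = np.random.default_rng(0)
w = rng.normal(size=10)
grads = rng.normal(size=(32, 10))
scores = obs_pruning_scores(w, woodfisher_inverse(grads))
print(np.argsort(scores)[:3])  # indices of the weights cheapest to prune
```

In practice a dense d-by-d inverse is infeasible for large networks, so such an estimate would have to be restricted to blocks of coordinates; the sketch above only illustrates the algebra.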