Pruning neural networks reduces inference time and memory costs. On standard hardware, these benefits are especially prominent when coarse-grained structures, such as entire feature maps, are pruned. We devise two novel saliency-based methods for second-order structured pruning (SOSP) that include correlations among all structures and layers. Our main method, SOSP-H, employs an innovative second-order approximation that enables saliency evaluations via fast Hessian-vector products. SOSP-H thereby scales like a first-order method despite taking the full Hessian into account. We validate SOSP-H by comparing it to our second method, SOSP-I, which uses a well-established Hessian approximation, and to numerous state-of-the-art methods. While SOSP-H performs on par or better in terms of accuracy, it has clear advantages in scalability and efficiency. This allowed us to scale SOSP-H to large-scale vision tasks, even though it captures correlations across all layers of the network. To underscore the global nature of our pruning methods, we evaluate their performance not only by removing structures from a pretrained network, but also by detecting architectural bottlenecks. We show that our algorithms allow us to systematically reveal architectural bottlenecks, which we then remove to further increase the accuracy of the networks.
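To make the Hessian-vector-product idea concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of how a second-order saliency for pruning one structure, here an output channel of a convolution, can be estimated with standard autograd double-backward. All names (`model`, the toy data, the choice of channel 0) are placeholders chosen for the example.

```python
# Minimal sketch: second-order saliency of removing one conv channel,
# estimated via a Hessian-vector product (assumed setup, not the SOSP code).
import torch
import torch.nn as nn

def hvp_from_grads(grads, params, vec):
    """Given grads = dL/dparams (built with create_graph=True), return H @ vec."""
    dot = sum((g * v).sum() for g, v in zip(grads, vec))
    hvp = torch.autograd.grad(dot, params, retain_graph=True, allow_unused=True)
    return [h if h is not None else torch.zeros_like(p) for h, p in zip(hvp, params)]

# Toy network and batch (placeholders for illustration only).
torch.manual_seed(0)
model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                      nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 10))
x, y = torch.randn(4, 3, 16, 16), torch.randint(0, 10, (4,))
loss = nn.functional.cross_entropy(model(x), y)

params = [p for p in model.parameters() if p.requires_grad]
grads = torch.autograd.grad(loss, params, create_graph=True)

# Perturbation vector that zeroes output channel 0 of the first conv layer.
vec = [torch.zeros_like(p) for p in params]
vec[0][0] = -params[0].detach()[0]   # conv weight slice for channel 0
vec[1][0] = -params[1].detach()[0]   # conv bias entry for channel 0

hvp = hvp_from_grads(grads, params, vec)

# Second-order Taylor estimate of the loss change caused by the perturbation:
# dL ≈ g·v + 0.5 * v·(H v), evaluated with one gradient and one HVP.
first_order = sum((g * v).sum() for g, v in zip(grads, vec))
second_order = 0.5 * sum((h * v).sum() for h, v in zip(hvp, vec))
print(f"estimated loss change for pruning channel 0: {(first_order + second_order).item():.4f}")
```

The key point of the sketch is the cost profile: the Hessian is never materialized; each saliency evaluation needs only gradients and one Hessian-vector product, which is why such a scheme scales like a first-order method while still using full second-order information.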