Channel pruning has made major headway in the design of efficient deep learning models. Conventional approaches rely on hand-crafted pruning functions to score channel importance, which requires domain knowledge and can be sub-optimal. In this work, we propose an end-to-end framework to automatically discover strong pruning metrics. Specifically, we craft a novel design space for expressing pruning functions and leverage genetic programming, an evolutionary search strategy, to evolve high-quality and transferable pruning functions. Unlike prior methods, our approach not only yields compact pruned networks for efficient inference, but also produces novel closed-form pruning metrics that are mathematically explainable and thus generalizable to different pruning tasks. The evolution is conducted on small datasets, and the learned functions transfer to larger datasets without any manual modification. Compared to direct evolution on a large dataset, our strategy shows better cost-effectiveness. When applied to more challenging datasets that differ from those used during evolution, e.g., ILSVRC-2012, an evolved function achieves state-of-the-art pruning results.
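To make the overall idea more concrete, the following is a minimal, hypothetical sketch of evolving a channel-scoring metric with genetic programming. It is not the paper's implementation: the per-channel statistics (l1, l2, mean_act, grad_norm), the operator set, the tree depth, and the toy fitness stand-in are all assumptions made for illustration; in practice fitness would be the accuracy of the pruned network on a small proxy dataset.

```python
# Illustrative sketch only (assumed design, not the authors' code): candidate pruning
# metrics are expression trees over per-channel statistics, evolved by a simple
# mutation-only genetic-programming loop.
import random
import numpy as np

# Hypothetical per-channel statistics a candidate metric may combine.
PRIMITIVES = ["l1", "l2", "mean_act", "grad_norm"]
OPS = {"add": np.add, "mul": np.multiply, "max": np.maximum}

def random_expr(depth=2):
    """Sample a random expression tree over the channel statistics."""
    if depth <= 0 or random.random() < 0.3:
        return random.choice(PRIMITIVES)
    op = random.choice(list(OPS))
    return (op, random_expr(depth - 1), random_expr(depth - 1))

def evaluate(expr, stats):
    """Score every channel by recursively evaluating the expression tree."""
    if isinstance(expr, str):
        return stats[expr]
    op, left, right = expr
    return OPS[op](evaluate(left, stats), evaluate(right, stats))

def mutate(expr, depth=2):
    """Replace a random subtree with a freshly sampled one."""
    if isinstance(expr, str) or random.random() < 0.3:
        return random_expr(depth)
    op, left, right = expr
    if random.random() < 0.5:
        return (op, mutate(left, depth - 1), right)
    return (op, left, mutate(right, depth - 1))

def evolve(fitness_fn, generations=20, pop_size=16):
    """Keep the best half of the population each generation; mutate to refill."""
    population = [random_expr() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness_fn, reverse=True)
        survivors = population[: pop_size // 2]
        population = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return max(population, key=fitness_fn)

if __name__ == "__main__":
    # Toy usage: 64 channels with random statistics; the fitness here is a stand-in
    # for pruned-network accuracy on a small proxy dataset.
    stats = {k: np.random.rand(64) for k in PRIMITIVES}
    target = stats["l1"] * stats["grad_norm"]  # pretend-ideal channel importance
    fitness = lambda e: -np.abs(evaluate(e, stats) - target).sum()
    best = evolve(fitness)
    print("evolved metric:", best)
```

Because the discovered metric is a closed-form expression over generic channel statistics rather than a learned black box, it can be inspected and reused on datasets other than the one it was evolved on, which is the property the abstract refers to as transferability.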