Structural design of neural networks is crucial for the success of deep learning. While most prior works in evolutionary learning aim at directly searching the structure of a network, few attempts have been made on another promising track, channel pruning, which has recently made major headway in designing efficient deep learning models. Prior pruning methods rely on hand-crafted pruning functions to score a channel's importance, which requires domain knowledge and could be sub-optimal. To this end, we pioneer the use of genetic programming (GP) to discover strong pruning metrics automatically. Specifically, we craft a novel design space to express high-quality and transferable pruning functions, ensuring an end-to-end evolution process in which the evolved functions require no manual modification to transfer after evolution. Unlike prior methods, our approach provides both compact pruned networks for efficient inference and novel closed-form pruning metrics that are mathematically explainable and thus generalizable to different pruning tasks. While the evolution is conducted on small datasets, our evolved functions show promising results when applied to more challenging datasets different from those used in the evolution process. For example, on ILSVRC-2012, an evolved function achieves state-of-the-art pruning results.
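To make the core idea concrete, below is a minimal sketch in Python of channel pruning driven by a closed-form scoring function. It is an illustration only: the statistics (`w`, `g`), the baseline `l1_metric`, the `evolved_metric` form, and the `prune_channels` helper are all hypothetical stand-ins, not the paper's actual design space or GP-discovered functions.

```python
import numpy as np

# Hypothetical per-channel statistics for a conv layer with C output channels:
#   w -- flattened filter weights per channel, shape (C, K)
#   g -- gradients w.r.t. those weights (e.g., from one mini-batch), shape (C, K)

def l1_metric(w, g):
    """Hand-crafted baseline: score each channel by the L1 norm of its filter."""
    return np.abs(w).sum(axis=1)

def evolved_metric(w, g):
    """Illustrative closed-form metric of the kind GP might discover,
    e.g. a saliency-like weight-gradient product (hypothetical)."""
    return np.abs(w * g).sum(axis=1)

def prune_channels(w, g, metric, ratio=0.5):
    """Keep the highest-scoring (1 - ratio) fraction of channels."""
    scores = metric(w, g)
    n_keep = max(1, int(round(len(scores) * (1.0 - ratio))))
    keep = np.argsort(scores)[::-1][:n_keep]  # indices of surviving channels
    return np.sort(keep)

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 3 * 3 * 16))  # 64 channels of 3x3x16 filters
g = rng.normal(size=w.shape)
print(prune_channels(w, g, evolved_metric, ratio=0.75))
```

In this framing, GP would search over expression trees combining such per-channel statistics with elementary operators, and the fitness of each candidate function would be the accuracy of the network pruned with it.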