In this paper, we propose a novel meta-learning approach for the automatic channel pruning of very deep neural networks. We first train a PruningNet, a kind of meta network, which can generate weight parameters for any pruned structure of the target network. We train the PruningNet with a simple stochastic structure-sampling method. We then apply an evolutionary procedure to search for well-performing pruned networks. The search is highly efficient because the weights are directly generated by the trained PruningNet, so no fine-tuning is needed. With a single PruningNet trained for the target network, we can search for various pruned networks under different constraints with little human participation. We demonstrate competitive performance on MobileNet V1/V2 networks, with up to 9.0/9.9 higher ImageNet accuracy than V1/V2. Compared to previous state-of-the-art AutoML-based pruning methods, such as AMC and NetAdapt, we achieve higher or comparable accuracy under various conditions.
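The core mechanism above can be illustrated with a minimal sketch: a meta network maps a sampled channel configuration to the weights of a pruned convolution, and stochastic structure sampling draws a random channel count per layer during training. This is a simplified illustration, not the paper's implementation; the class and function names, the tiny two-layer generator, and the `MAX_CHANNELS` values are all assumptions for the example.

```python
import random
import torch
import torch.nn as nn

# Hypothetical per-layer channel budgets; the real method targets MobileNet blocks.
MAX_CHANNELS = [32, 64, 128]

class PruningNet(nn.Module):
    """Toy meta network: maps a channel configuration to conv weights.

    It always generates the full-size weight tensor, then crops it to the
    sampled pruned structure, so one network serves every configuration.
    """
    def __init__(self, max_out, max_in, k=3):
        super().__init__()
        self.max_out, self.max_in, self.k = max_out, max_in, k
        # A small fully connected generator (an assumption for this sketch).
        self.fc = nn.Sequential(
            nn.Linear(2, 64), nn.ReLU(),
            nn.Linear(64, max_out * max_in * k * k),
        )

    def forward(self, c_out, c_in):
        # Encode the sampled structure as normalized channel counts.
        enc = torch.tensor([[c_out / self.max_out, c_in / self.max_in]])
        w = self.fc(enc).view(self.max_out, self.max_in, self.k, self.k)
        # Crop the generated tensor to the pruned structure's shape.
        return w[:c_out, :c_in]

def sample_structure():
    """Stochastic structure sampling: one random channel count per layer."""
    return [random.randint(1, c) for c in MAX_CHANNELS]
```

At each training step, a structure is sampled, the PruningNet generates matching weights, and the loss of the resulting pruned network updates the PruningNet itself; at search time, candidate structures are evaluated directly with generated weights, which is why no fine-tuning is needed.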