Deep Convolutional Neural Networks (DCNNs) are currently used to solve a wide range of problems in machine learning and artificial intelligence thanks to their learning and adaptation capabilities. However, most successful DCNN models have a high computational complexity, which makes them difficult to deploy on mobile or embedded platforms. This problem has prompted many researchers to develop algorithms and approaches that reduce the computational complexity of such models. One of these approaches is filter pruning, in which convolution filters are eliminated to reduce the number of parameters and, consequently, the computational complexity of the model. In the present work, we propose a novel filter pruning algorithm based on a Multi-Objective Evolution Strategy (ES), called DeepPruningES. Our approach avoids the need for any prior knowledge during the pruning procedure and supports decision-makers by returning three pruned CNN models with different trade-offs between performance and computational complexity. We show that DeepPruningES can significantly reduce a model's computational complexity by testing it on three DCNN architectures: Convolutional Neural Networks (CNNs), Residual Neural Networks (ResNets), and Densely Connected Convolutional Networks (DenseNets).
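To make the notion of filter pruning concrete, the following is a minimal sketch, assuming PyTorch, of removing whole convolution filters from a layer. The L1-norm ranking used here is a common illustrative criterion and is an assumption for this sketch; it is not the DeepPruningES selection mechanism, which instead relies on a multi-objective evolution strategy to choose which filters to drop.

```python
# Illustrative filter-pruning sketch (PyTorch assumed).
# NOTE: the L1-norm ranking below is a placeholder criterion, not the
# evolutionary selection performed by DeepPruningES.
import torch
import torch.nn as nn


def prune_conv_filters(conv: nn.Conv2d, keep_ratio: float = 0.5) -> nn.Conv2d:
    """Return a new Conv2d keeping only the filters with the largest L1 norms."""
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    # One score per output filter (a filter is one slice along dim 0 of the weight tensor).
    scores = conv.weight.data.abs().sum(dim=(1, 2, 3))
    keep_idx = torch.argsort(scores, descending=True)[:n_keep]

    pruned = nn.Conv2d(
        in_channels=conv.in_channels,
        out_channels=n_keep,
        kernel_size=conv.kernel_size,
        stride=conv.stride,
        padding=conv.padding,
        bias=conv.bias is not None,
    )
    # Copy over only the retained filters (and their biases, if present).
    pruned.weight.data = conv.weight.data[keep_idx].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep_idx].clone()
    return pruned


# Usage: a 64-filter layer reduced to 32 filters; the parameter count drops accordingly.
layer = nn.Conv2d(3, 64, kernel_size=3, padding=1)
smaller = prune_conv_filters(layer, keep_ratio=0.5)
print(layer.weight.shape, "->", smaller.weight.shape)  # (64, 3, 3, 3) -> (32, 3, 3, 3)
```

In a full network, removing filters from one layer also shrinks the input channels of the next layer; multi-objective approaches such as DeepPruningES search over such pruning decisions to trade off accuracy against computational cost.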