Pruning Deep Convolutional Neural Networks Architectures with Evolution Strategy
Information Sciences Pub Date : 2020-11-28 , DOI: 10.1016/j.ins.2020.11.009
Francisco E. Fernandes Jr. , Gary G. Yen

Currently, Deep Convolutional Neural Networks (DCNNs) are used to solve all kinds of problems in machine learning and artificial intelligence due to their learning and adaptation capabilities. However, most successful DCNN models have a high computational complexity, making them difficult to deploy on mobile or embedded platforms. This problem has prompted many researchers to develop algorithms and approaches to reduce the computational complexity of such models. One of them is filter pruning, in which convolution filters are eliminated to reduce the number of parameters and, consequently, the computational complexity of a given model. In the present work, we propose a novel filter-pruning algorithm based on a Multi-Objective Evolution Strategy (MOES), called DeepPruningES. Our approach requires no prior knowledge during the pruning procedure and helps decision-makers by returning three pruned CNN models with different trade-offs between performance and computational complexity. We show that DeepPruningES can significantly reduce a model’s computational complexity by testing it on three DCNN architectures: Convolutional Neural Networks (CNNs), Residual Neural Networks (ResNets), and Densely Connected Neural Networks (DenseNets).
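To make the filter-pruning idea concrete, the sketch below shows a common magnitude-based baseline: filters with the smallest L1 norms are removed at three different pruning rates, yielding three models with different size/accuracy trade-offs, loosely mirroring the three trade-off solutions DeepPruningES returns. This is illustrative only; the abstract does not specify the paper's encoding, and the actual method searches pruning configurations with a multi-objective evolution strategy rather than ranking filters by magnitude.

```python
import random

random.seed(0)
# Hypothetical conv layer: 64 filters, each flattened to 27 weights (3x3x3).
filters = [[random.gauss(0, 1) for _ in range(27)] for _ in range(64)]

def prune_filters(layer, rate):
    """Drop the fraction `rate` of filters with the smallest L1 norms.

    This is the classic magnitude-pruning baseline, not the paper's
    MOES search; it only illustrates what 'eliminating convolution
    filters to reduce parameters' means in practice.
    """
    # Rank filter indices by ascending L1 norm.
    order = sorted(range(len(layer)), key=lambda i: sum(abs(w) for w in layer[i]))
    n_drop = int(round(len(layer) * rate))
    kept = sorted(order[n_drop:])  # keep the largest-norm filters, original order
    return [layer[i] for i in kept]

# Three trade-off points: light, medium, and heavy pruning.
for rate in (0.25, 0.5, 0.75):
    pruned = prune_filters(filters, rate)
    total_before = sum(len(f) for f in filters)
    total_after = sum(len(f) for f in pruned)
    print(f"rate={rate}: {total_before} -> {total_after} parameters")
```

Each pruning rate corresponds to one point on the performance/complexity trade-off curve; a multi-objective method like the paper's instead discovers such points automatically rather than fixing the rates in advance.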




Updated: 2020-12-22