CCPrune: Collaborative channel pruning for learning compact convolutional networks
Neurocomputing (IF 6), Pub Date: 2021-04-23, DOI: 10.1016/j.neucom.2021.04.063
Yanming Chen , Xiang Wen , Yiwen Zhang , Weisong Shi

Deep convolutional neural networks (CNNs) are difficult to deploy on resource-constrained devices due to their huge computational cost. Channel pruning is an effective method for reducing computation and accelerating network inference. Most channel pruning methods use statistics from a single structure (the convolutional layer or the batch normalization layer) of the sparse network to evaluate the importance of channels. The limitation of these methods is that they may often mistakenly delete important channels. In view of this, we propose a novel method, Collaborative Channel Pruning (CCPrune), which evaluates channel importance by combining the convolutional layer weights and the BN layer scaling factors. The proposed method first introduces regularization on the convolutional layer weights and the BN layer scaling factors, respectively. It then combines the weights of the convolutional layer and the scaling factors of the BN layer to evaluate the importance of each channel. Finally, it can delete unimportant channels without reducing the performance of the model. The experimental results demonstrate the effectiveness of our method: on CIFAR-10, it reduces the FLOPs of VGG-19 by 85.50% with only a slight drop in accuracy, and reduces the FLOPs of ResNet-50 by 78.31% with no loss of accuracy.
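To make the two steps of the abstract concrete, below is a minimal PyTorch sketch: an L1 sparsity regularizer applied to both the convolution weights and the BN scaling factors, followed by a collaborative per-channel score that combines the two structures. The penalty coefficients (`lambda_conv`, `lambda_bn`) and the product combination rule are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

def sparsity_penalty(model: nn.Module,
                     lambda_conv: float = 1e-4,
                     lambda_bn: float = 1e-4) -> torch.Tensor:
    """Sparsity regularizer added to the task loss during training:
    L1 penalties on conv weights and BN scaling factors.
    (The coefficients are illustrative placeholders.)"""
    penalty = 0.0
    for m in model.modules():
        if isinstance(m, nn.Conv2d):
            penalty = penalty + lambda_conv * m.weight.abs().sum()
        elif isinstance(m, nn.BatchNorm2d):
            penalty = penalty + lambda_bn * m.weight.abs().sum()
    return penalty

def channel_importance(conv: nn.Conv2d, bn: nn.BatchNorm2d) -> torch.Tensor:
    """Collaborative per-channel score using both structures.
    The product of the per-channel L1 weight norm and |gamma| is an
    assumed combination rule for illustration."""
    # conv.weight: (out_channels, in_channels, kH, kW) -> one norm per output channel
    w_norm = conv.weight.detach().abs().sum(dim=(1, 2, 3))
    gamma = bn.weight.detach().abs()  # BN scaling factors
    return w_norm * gamma

def prune_mask(score: torch.Tensor, prune_ratio: float) -> torch.Tensor:
    """Boolean mask keeping the highest-scoring channels."""
    k = int(score.numel() * prune_ratio)  # number of channels to drop
    threshold = score.sort().values[k]    # k-th smallest score
    return score >= threshold

# Example: score and mask one conv/BN pair.
conv = nn.Conv2d(3, 64, kernel_size=3, padding=1)
bn = nn.BatchNorm2d(64)
mask = prune_mask(channel_importance(conv, bn), prune_ratio=0.5)
print(f"keeping {int(mask.sum())}/64 channels")
```

The intended workflow follows the abstract: train with the sparsity penalty added to the task loss, rank channels with the combined score, then remove the lowest-scoring channels.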




Updated: 2021-05-05