Pruning Convolutional Neural Networks with Self-Supervision
arXiv - CS - Neural and Evolutionary Computing. Pub Date: 2020-01-10, DOI: arxiv-2001.03554
Mathilde Caron, Ari Morcos, Piotr Bojanowski, Julien Mairal and Armand Joulin

Convolutional neural networks trained without supervision come close to matching the performance of supervised pre-training, but sometimes at the cost of an even higher number of parameters. Extracting subnetworks from these large unsupervised convnets while preserving performance is of particular interest, as it makes them less computationally intensive. Typical pruning methods operate during training on a task while trying to maintain the performance of the pruned network on that same task. However, in self-supervised feature learning, the training objective is agnostic to how well the representation transfers to downstream tasks. Thus, preserving performance on this objective does not ensure that the pruned subnetwork remains effective for solving downstream tasks. In this work, we investigate the use of standard pruning methods, developed primarily for supervised learning, on networks trained without labels (i.e. on self-supervised tasks). We show that pruning masks obtained with or without labels reach comparable performance when re-trained on labels, suggesting that pruning operates similarly for self-supervised and supervised learning. Interestingly, we also find that pruning preserves the transfer performance of self-supervised subnetwork representations.
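
To make the setting concrete, the sketch below shows one way a pruning mask could be extracted from a convnet assumed to be pre-trained without labels and then held fixed while the surviving weights are re-trained on a labeled task. This is a minimal illustration, not the paper's exact pipeline: it uses global L1 magnitude pruning from torch.nn.utils.prune as a stand-in for "standard pruning methods", and the ResNet-50 backbone, 90% sparsity level, and single dummy batch are illustrative assumptions.

```python
# Minimal sketch (assumptions noted above): prune a self-supervised backbone,
# then re-train on labels while the pruning mask stays fixed.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
from torchvision.models import resnet50

# Assume `model` already holds weights from a self-supervised pre-training run
# (loading such a checkpoint is omitted here).
model = resnet50()

# Globally prune the smallest-magnitude convolutional weights; this attaches a
# binary `weight_mask` buffer to every pruned module.
conv_params = [(m, "weight") for m in model.modules() if isinstance(m, nn.Conv2d)]
prune.global_unstructured(conv_params,
                          pruning_method=prune.L1Unstructured,
                          amount=0.9)

# Re-train on labels with the mask held fixed: each forward pass recomputes
# weight = weight_orig * weight_mask, so pruned entries remain zero while the
# optimizer updates only the underlying weight_orig tensors.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
criterion = nn.CrossEntropyLoss()
model.train()
images = torch.randn(8, 3, 224, 224)          # stand-in for a labeled batch
labels = torch.randint(0, 1000, (8,))
optimizer.zero_grad()
criterion(model(images), labels).backward()
optimizer.step()

# Optionally fold the mask into the weights once re-training is done.
for module, name in conv_params:
    prune.remove(module, name)
```

In this sketch the mask plays the role of the "pruning mask obtained without labels" from the abstract; the paper's experiments compare such masks against masks derived with label supervision after re-training both on the labeled task.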

Updated: 2020-01-13