Lightweight Deep CNN for Natural Image Matting via Similarity-Preserving Knowledge Distillation
IEEE Signal Processing Letters (IF 3.2), Pub Date: 2020-01-01, DOI: 10.1109/lsp.2020.3039952
Donggeun Yoon, Jinsun Park, Donghyeon Cho

Recently, alpha matting has seen remarkable progress driven by wide and deep convolutional neural networks. However, previous deep-learning-based alpha matting methods require high computational cost, which limits their use in real environments, including mobile devices. In this letter, a lightweight natural image matting network trained with similarity-preserving knowledge distillation is developed. The similarity-preserving knowledge distillation makes the pairwise similarities produced by a compact student network match those produced by a teacher network. Pairwise similarities measured over spatial, channel, and batch units enable the teacher's knowledge to be transferred to the student. Based on this similarity-preserving knowledge distillation, we not only design a student network that is lighter and smaller than the teacher but also achieve superior performance compared with the same student trained without knowledge distillation. In addition, the proposed algorithm can be seamlessly applied to various deep image matting algorithms. Therefore, our algorithm is effective for mobile applications (e.g., human portrait matting), which are in growing demand. The effectiveness of the proposed algorithm is verified on two public benchmark datasets.
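The core mechanism described in the abstract is a distillation loss that aligns pairwise similarity structures between teacher and student features. The PyTorch-style sketch below illustrates one common way such a similarity-preserving loss can be formed over batch, channel, and spatial units; it is a minimal illustration based only on the abstract, not the authors' implementation, and the names `similarity_matrix` and `sp_kd_loss` are hypothetical.

```python
import torch
import torch.nn.functional as F

def similarity_matrix(feat: torch.Tensor, unit: str) -> torch.Tensor:
    """Row-normalized pairwise similarity of a feature map of shape (B, C, H, W).

    unit='batch'  : (B, B)      similarities between samples in the mini-batch
    unit='channel': (B, C, C)   per-sample similarities between channels
    unit='spatial': (B, HW, HW) per-sample similarities between spatial positions
    """
    b, c, h, w = feat.shape
    if unit == 'batch':
        a = feat.reshape(b, -1)                        # (B, CHW)
        g = a @ a.t()                                  # (B, B)
    elif unit == 'channel':
        a = feat.reshape(b, c, h * w)                  # (B, C, HW)
        g = torch.bmm(a, a.transpose(1, 2))            # (B, C, C)
    elif unit == 'spatial':
        a = feat.reshape(b, c, h * w).transpose(1, 2)  # (B, HW, C)
        g = torch.bmm(a, a.transpose(1, 2))            # (B, HW, HW)
    else:
        raise ValueError(unit)
    return F.normalize(g, p=2, dim=-1)                 # L2-normalize each row

def sp_kd_loss(feat_s: torch.Tensor, feat_t: torch.Tensor) -> torch.Tensor:
    """Similarity-preserving distillation loss over batch, channel, and spatial units.

    Assumes student and teacher feature maps share the same shape; in practice a
    1x1 projection on the student features may be needed to satisfy this.
    """
    loss = feat_s.new_zeros(())
    for unit in ('batch', 'channel', 'spatial'):
        g_s = similarity_matrix(feat_s, unit)
        g_t = similarity_matrix(feat_t.detach(), unit)  # no gradient to the teacher
        loss = loss + (g_s - g_t).pow(2).mean()         # Frobenius-style penalty
    return loss
```

The batch-unit term mirrors the standard similarity-preserving KD formulation of Tung and Mori (2019); the channel and spatial terms are analogous extensions suggested by the abstract's wording, and the exact weighting between the three terms would be a training hyperparameter.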

Updated: 2020-01-01