Probabilistic Knowledge Transfer for Lightweight Deep Representation Learning.
IEEE Transactions on Neural Networks and Learning Systems (IF 10.2), Pub Date: 2020-06-01, DOI: 10.1109/tnnls.2020.2995884
Nikolaos Passalis, Maria Tzelepi, Anastasios Tefas

Knowledge-transfer (KT) methods allow the knowledge contained in a large deep learning model to be transferred into a more lightweight and faster model. However, the vast majority of existing KT approaches are designed mainly for classification and detection tasks, which limits their performance on other tasks, such as representation/metric learning. To overcome this limitation, a novel probabilistic KT (PKT) method is proposed in this article. PKT transfers knowledge into a smaller student model by preserving as much of the information expressed through the teacher model as possible. The ability to use different kernels for estimating the probability distributions of the teacher and student models, along with the different divergence metrics that can be used for transferring the knowledge, allows the method to be easily adapted to different applications. PKT outperforms several existing state-of-the-art KT techniques, while providing new insights into KT by enabling several novel applications, as demonstrated through extensive experiments on several challenging data sets.
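Concretely, the distribution matching described in the abstract reduces to a batch-wise loss: estimate the conditional probability that samples are neighbors in the teacher's feature space and in the student's, then minimize a divergence between the two distributions. The sketch below is a minimal PyTorch rendition of that idea, not the authors' reference code; it assumes the cosine-similarity kernel and KL divergence discussed in the paper, and the function name pkt_loss and the eps stabilizer are illustrative choices.

```python
import torch


def pkt_loss(student_feats, teacher_feats, eps=1e-7):
    """Probabilistic knowledge transfer loss (sketch).

    Matches the conditional probability distributions induced by
    pairwise cosine similarities in the student and teacher feature
    spaces, using the KL divergence.
    """
    # L2-normalize the feature vector of each batch sample.
    student = student_feats / (student_feats.norm(dim=1, keepdim=True) + eps)
    teacher = teacher_feats / (teacher_feats.norm(dim=1, keepdim=True) + eps)

    # Pairwise cosine similarities, shifted from [-1, 1] to [0, 1]
    # so they can serve as (unnormalized) kernel values.
    k_s = (student @ student.t() + 1.0) / 2.0
    k_t = (teacher @ teacher.t() + 1.0) / 2.0

    # Row-normalize into conditional probability distributions.
    p_s = k_s / k_s.sum(dim=1, keepdim=True)
    p_t = k_t / k_t.sum(dim=1, keepdim=True)

    # KL divergence between the teacher and student distributions.
    return (p_t * torch.log((p_t + eps) / (p_s + eps))).mean()


# Toy usage: a 128-sample batch with 512-D teacher and 64-D student
# embeddings. Only pairwise similarities are compared, so the two
# feature spaces may have different dimensionalities.
teacher_out = torch.randn(128, 512)
student_out = torch.randn(128, 64, requires_grad=True)
loss = pkt_loss(student_out, teacher_out)
loss.backward()
```

Because the loss operates on similarity-derived probabilities rather than raw activations, swapping in a different kernel (e.g., a Gaussian kernel on Euclidean distances) or a different divergence only changes the k_* and final lines of the sketch, which is what makes the method easy to adapt across applications.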

Updated: 2020-06-01