Certainty driven consistency loss on multi-teacher networks for semi-supervised learning
Pattern Recognition (IF 8) Pub Date: 2021-07-08, DOI: 10.1016/j.patcog.2021.108140
Lu Liu, Robby T. Tan

One of the successful approaches in semi-supervised learning is based on consistency regularization. Typically, a student model is trained to be consistent with a teacher's predictions for inputs under different perturbations. For this to succeed, the prediction targets given by the teacher should be of good quality; otherwise, the student can be misled by the teacher. Unfortunately, existing methods do not assess the quality of the teacher's targets. In this paper, we propose a novel Certainty-driven Consistency Loss (CCL) that exploits predictive uncertainty in the consistency loss to let the student dynamically learn from reliable targets. Specifically, we propose two approaches, Filtering CCL and Temperature CCL, which either filter out uncertain predictions or pay less attention to them in the consistency regularization. We further introduce a novel decoupled framework to encourage model difference. Experimental results on SVHN, CIFAR-10, and CIFAR-100 demonstrate the advantages of our method over several existing methods.
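To illustrate the filtering idea, the following is a minimal sketch of a certainty-gated consistency loss. It approximates the teacher's certainty by its maximum softmax probability and masks out low-certainty samples before averaging a mean-squared consistency term; the threshold `tau`, the certainty proxy, and the MSE form are illustrative assumptions, not the paper's exact multi-teacher uncertainty estimate.

```python
import numpy as np

def softmax(logits, axis=-1):
    """Numerically stable softmax over the class dimension."""
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def filtering_consistency_loss(student_logits, teacher_logits, tau=0.8):
    """Filtering-style consistency loss (illustrative sketch).

    Certainty is approximated by the teacher's max softmax probability;
    the paper's CCL derives uncertainty differently (e.g. from multiple
    teachers), which this sketch does not reproduce.
    """
    p_s = softmax(student_logits)
    p_t = softmax(teacher_logits)
    certainty = p_t.max(axis=-1)           # per-sample confidence proxy
    mask = certainty >= tau                # keep only certain targets
    if not mask.any():
        return 0.0                         # no reliable targets this batch
    sq = ((p_s - p_t) ** 2).sum(axis=-1)   # per-sample MSE consistency
    return float(sq[mask].mean())          # average over certain samples only
```

Temperature CCL can be obtained from the same ingredients by replacing the hard mask with per-sample weights that decrease as teacher uncertainty grows, so uncertain targets still contribute, just less.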




Updated: 2021-07-16