Semisupervised Multiple Choice Learning for Ensemble Classification
IEEE Transactions on Cybernetics (IF 9.4), Pub Date: 2020-09-14, DOI: 10.1109/tcyb.2020.3016048
Jian Zhong, Xiangping Zeng, Wenming Cao, Si Wu, Cheng Liu, Zhiwen Yu, Hau-San Wong
Ensemble learning has many successful applications because of its effectiveness in boosting the predictive performance of classification models. In this article, we propose a semisupervised multiple choice learning (SemiMCL) approach to jointly train a network ensemble on partially labeled data. Our model mainly focuses on improving the assignment of labeled data among the constituent networks and exploiting unlabeled data to capture domain-specific information, such that semisupervised classification can be effectively facilitated. Different from conventional multiple choice learning models, the constituent networks learn multiple tasks in the training process. Specifically, an auxiliary reconstruction task is included to learn domain-specific representation. For the purpose of performing implicit labeling on reliable unlabeled samples, we adopt a negative ℓ1-norm regularization when minimizing the conditional entropy with respect to the posterior probability distribution. Extensive experiments on multiple real-world datasets are conducted to verify the effectiveness and superiority of the proposed SemiMCL model.
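Two ingredients of the abstract can be made concrete in a minimal sketch: the multiple-choice-learning style routing of each labeled sample to the constituent network that handles it best (the "oracle" assignment), and the conditional entropy of the posteriors that is minimized on unlabeled data. This is an illustrative NumPy sketch, not the paper's implementation; the function names, the hard arg-min assignment, and the omission of the negative ℓ1-norm regularizer and the reconstruction task are all simplifying assumptions here.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last (class) axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def mcl_assignment(ensemble_logits, labels):
    """Multiple-choice-learning style assignment (sketch).

    Each labeled sample is routed to the constituent network with the
    lowest cross-entropy on it; only that network's loss counts for
    the sample (the 'oracle' loss).
    ensemble_logits: (M, N, C) for M networks, N samples, C classes.
    labels: (N,) integer class labels.
    """
    probs = softmax(ensemble_logits)                       # (M, N, C)
    n = labels.shape[0]
    ce = -np.log(probs[:, np.arange(n), labels] + 1e-12)   # (M, N)
    winners = ce.argmin(axis=0)                            # (N,) best network per sample
    oracle_loss = ce[winners, np.arange(n)].mean()
    return winners, oracle_loss

def conditional_entropy(ensemble_logits_u):
    """Mean conditional entropy of posteriors on unlabeled data.

    Minimizing this pushes each constituent network toward confident
    (low-entropy) predictions; SemiMCL additionally regularizes this
    objective with a negative ℓ1-norm term (see the paper).
    """
    p = softmax(ensemble_logits_u)
    return -(p * np.log(p + 1e-12)).sum(axis=-1).mean()
```

In a full training loop the oracle loss would be back-propagated only through each sample's winning network, which is what drives the constituent networks to specialize on different regions of the data.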

Updated: 2024-08-22