A Self-Ensemble Approach for Partial Multi-Label Learning
IEEE Access ( IF 3.9 ) Pub Date : 2020-01-01 , DOI: 10.1109/access.2020.2981389
Yan Yan , Shining Li

Partial multi-label learning (PML) tackles the problem where each training instance is associated with multiple candidate labels, of which only a subset are valid. In this paper, we propose a simple but effective batch-wise PML model, PML-SE, which addresses the PML problem with a self-ensemble approach (SE), namely the ensembling of models and predictions. Specifically, PML-SE introduces a teacher model that refines a more reliable soft label matrix for each training batch by iteratively ensembling the currently learned prediction network with the former one in an online manner. Besides, it adopts a MixUp data augmentation scheme to enhance the robustness of the prediction network against redundant irrelevant labels. In addition, we form self-ensemble label predictions through a consistency cost to boost the performance of the prediction network. Extensive experiments are conducted on synthesized and real-world PML datasets, and the proposed approach demonstrates state-of-the-art performance for partial multi-label learning.
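The three ingredients named in the abstract — the online teacher/student ensembling, the MixUp augmentation, and the consistency cost — can be sketched in a few lines. The code below is a minimal illustration of these generic mechanisms, not the authors' implementation; the function names, the exponential-moving-average form of the teacher update, and the mean-squared-error consistency cost are assumptions for illustration.

```python
import numpy as np

def ema_update(teacher_w, student_w, alpha=0.99):
    """Online model ensembling: the teacher's weights track an
    exponential moving average of the student's weights."""
    return alpha * teacher_w + (1 - alpha) * student_w

def mixup(x1, y1, x2, y2, lam):
    """MixUp augmentation: convex combination of two inputs and
    their (soft) label vectors with mixing coefficient lam."""
    return lam * x1 + (1 - lam) * x2, lam * y1 + (1 - lam) * y2

def consistency_cost(student_pred, teacher_pred):
    """Consistency cost: penalize disagreement between the student's
    and the teacher's predictions (here, mean squared error)."""
    return float(np.mean((student_pred - teacher_pred) ** 2))
```

In a typical self-ensemble training loop, each batch would be mixed with `mixup`, the student would be trained against the teacher's refined soft labels plus the `consistency_cost`, and the teacher would then be updated via `ema_update`.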
