Efficient Training for Positive Unlabeled Learning.
IEEE Transactions on Pattern Analysis and Machine Intelligence (IF 23.6). Pub Date: 2018-07-30. DOI: 10.1109/tpami.2018.2860995
Emanuele Sansone , Francesco G B De Natale , Zhi-Hua Zhou

Positive unlabeled (PU) learning is useful in various practical situations where a classifier for a class of interest must be learned from an unlabeled data set, which may contain anomalies as well as samples from unknown classes. The learning task can be formulated as an optimization problem under the framework of statistical learning theory. Recent studies have theoretically analyzed its properties and generalization performance; nevertheless, little effort has been made to address scalability, especially when large sets of unlabeled data are available. In this work, we propose a novel scalable PU learning algorithm that is theoretically proven to provide the optimal solution, while showing superior computational and memory performance. Experimental evaluation confirms the theoretical evidence and shows that the proposed method can be successfully applied to a large variety of real-world problems involving PU learning.
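The abstract does not spell out the optimization itself, but a standard way to pose PU learning as a risk-minimization problem is to replace the unavailable negative-class risk with an estimate built from the positive and unlabeled samples, given a known class prior (in the style of du Plessis et al.). The sketch below illustrates that generic formulation only, not the scalable algorithm proposed in this paper; the names `pu_risk`, `train_pu`, and `prior_pi` are illustrative assumptions.

```python
# Minimal sketch of a generic unbiased PU risk with a linear scorer f(x) = w.x.
# This is NOT the paper's algorithm; it only illustrates the PU objective.
import numpy as np

def sigmoid_loss(z):
    # Smooth surrogate loss: close to 1 for z << 0, close to 0 for z >> 0.
    return 1.0 / (1.0 + np.exp(z))

def pu_risk(w, X_pos, X_unl, prior_pi):
    """Empirical PU risk:
    R_pu(f) = pi * E_p[l(f(x))] - pi * E_p[l(-f(x))] + E_u[l(-f(x))],
    where pi is the (assumed known) prior of the positive class.
    """
    fp = X_pos @ w
    fu = X_unl @ w
    return (prior_pi * sigmoid_loss(fp).mean()
            - prior_pi * sigmoid_loss(-fp).mean()
            + sigmoid_loss(-fu).mean())

def train_pu(X_pos, X_unl, prior_pi, lr=0.1, steps=500):
    # Plain gradient descent with central-difference gradients, for clarity only.
    w = np.zeros(X_pos.shape[1])
    eps = 1e-5
    for _ in range(steps):
        grad = np.zeros_like(w)
        for j in range(len(w)):
            e = np.zeros_like(w)
            e[j] = eps
            grad[j] = (pu_risk(w + e, X_pos, X_unl, prior_pi)
                       - pu_risk(w - e, X_pos, X_unl, prior_pi)) / (2 * eps)
        w -= lr * grad
    return w
```

Minimizing this risk over w yields a linear PU classifier once the class prior is known; the contribution of the paper lies in making this kind of training scale, in both computation and memory, to large unlabeled sets.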

Last updated: 2019-10-23