Multi-task analysis discriminative dictionary learning for one-class learning
Knowledge-Based Systems (IF 7.2), Pub Date: 2021-06-05, DOI: 10.1016/j.knosys.2021.107195
Bo Liu , Haoxin Xie , Yanshan Xiao

One-class classification is a generalization of supervised learning that relies on examples from a single class, and it has attracted growing attention in machine learning and data mining. In this paper, we propose a novel approach called multi-task dictionary learning for one-class learning (MTD-OC), which incorporates analysis discriminative dictionary learning into one-class learning. The analysis discriminative dictionary learning ensures that the dictionaries corresponding to different tasks are as independent and discriminative as possible. It simultaneously minimizes an ℓ2,1-norm constraint, an analysis incoherence term and a sparse code extraction term, which together promote analysis incoherence and improve coding efficiency and classification accuracy. The one-class classifier on the target task is then constructed by transferring knowledge from multiple source tasks. In this way, the one-class classification term improves the analysis discriminative dictionaries, while the analysis discriminative dictionaries in turn improve the one-class classification term. In MTD-OC, a single optimization function is formulated to handle both the one-class classifier and analysis discriminative dictionary learning from one class of examples. We then propose an iterative framework to solve this optimization function and obtain the predictive classifier for the target class. Extensive experiments show that MTD-OC improves the accuracy of the one-class classifier by learning an analysis discriminative dictionary from each task to construct a transfer classifier.
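
The exact objective and solver of MTD-OC are not given in the abstract. As an illustration of the kind of alternating scheme it describes, the sketch below combines a sparse code extraction term, an ℓ2,1 row-sparsity penalty on each analysis dictionary, a pairwise incoherence penalty between dictionaries of different tasks, and a centre-based one-class term on the target task. The function name mtd_oc_sketch, the SVDD-style centre c, the penalty weights lam_*, and the plain gradient updates are all assumptions made for this sketch, not the paper's formulation.

```python
# Illustrative sketch only: the exact MTD-OC objective and solver are not in the
# abstract; this reconstructs a plausible alternating-minimization scheme.
import numpy as np

def l21_grad(P, eps=1e-8):
    """Gradient of the l2,1-norm of P (sum of row l2-norms)."""
    row_norms = np.linalg.norm(P, axis=1, keepdims=True) + eps
    return P / row_norms

def soft_threshold(A, lam):
    """Elementwise soft-thresholding: promotes sparse analysis codes."""
    return np.sign(A) * np.maximum(np.abs(A) - lam, 0.0)

def mtd_oc_sketch(X_tasks, target, k=32, lam_sparse=0.1, lam_l21=0.01,
                  lam_inc=0.1, lr=1e-3, n_iter=100, seed=0):
    """Alternate between analysis dictionaries {P_t}, sparse codes {A_t}
    and a centre-based one-class model on the target task.
    X_tasks: list of (d, n_t) data matrices, one per task."""
    rng = np.random.default_rng(seed)
    d = X_tasks[0].shape[0]
    P = [rng.standard_normal((k, d)) * 0.1 for _ in X_tasks]  # analysis dictionaries
    A = [P[t] @ X for t, X in enumerate(X_tasks)]             # analysis codes
    c = np.zeros(k)                                           # one-class centre (target task)

    for _ in range(n_iter):
        # (1) sparse code extraction: A_t <- shrink(P_t X_t)
        for t, X in enumerate(X_tasks):
            A[t] = soft_threshold(P[t] @ X, lam_sparse)

        # (2) dictionary update: fit the codes, keep rows sparse (l2,1),
        #     and keep dictionaries of different tasks incoherent
        for t, X in enumerate(X_tasks):
            grad = 2 * (P[t] @ X - A[t]) @ X.T + lam_l21 * l21_grad(P[t])
            for s in range(len(X_tasks)):
                if s != t:  # gradient of ||P_s P_t^T||_F^2 w.r.t. P_t
                    grad += 2 * lam_inc * (P[s].T @ (P[s] @ P[t].T)).T
            P[t] -= lr * grad

        # (3) one-class model on the target-task codes (simple centre update;
        #     the paper's transfer term across source tasks is omitted here)
        c = A[target].mean(axis=1)

    return P, A, c
```

As a usage sketch under the same assumptions, P, A, c = mtd_oc_sketch([X_src, X_tgt], target=1) would learn one analysis dictionary per task plus a centre for the target codes, and a test point x could then be scored by the distance of soft_threshold(P[1] @ x, 0.1) from c.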



Updated: 2021-06-08