Unsupervised feature selection by self-paced learning regularization
Pattern Recognition Letters (IF 5.1) Pub Date: 2018-06-28, DOI: 10.1016/j.patrec.2018.06.029
Wei Zheng, Xiaofeng Zhu, Guoqiu Wen, Yonghua Zhu, Hao Yu, Jiangzhang Gan

Previous feature selection methods treat all samples equally when selecting important features. However, samples are often diverse: outliers should receive small or even zero weights, while important samples should receive large weights. In this paper, we add a self-paced regularization term to a sparse feature selection model to reduce the impact of outliers on feature selection. Specifically, the proposed method automatically selects a subset containing the most important samples to build an initial feature selection model, and then improves the model's generalization ability by gradually involving the remaining important samples until a robust and well-generalized feature selection model is established or all samples have been used. Experimental results on eight real datasets show that the proposed method outperforms the comparison methods.
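
The self-paced mechanism described in the abstract can be pictured as an alternating loop: fit the sparse feature selection model on the currently admitted samples, recompute per-sample losses, and admit more samples as a pace threshold grows. The Python sketch below illustrates this general scheme under assumptions of my own: a self-representation objective (X reconstructed as X @ W) with an L2,1 row-sparsity penalty, hard self-paced weights, and a quantile-based pace schedule. The function names (l21_weighted_ridge, self_paced_feature_selection) and parameter choices are illustrative, not the authors' exact formulation.

# Minimal sketch of self-paced, sparsity-regularized feature selection.
# Assumptions: self-representation objective X ~ X @ W, L2,1 penalty on W,
# hard self-paced sample weights, quantile-based pace schedule.
import numpy as np

def l21_weighted_ridge(X, v, alpha=1.0, n_iter=20):
    # Approximately solve min_W sum_i v_i * ||x_i - x_i W||^2 + alpha * ||W||_{2,1}
    # with the standard iteratively re-weighted least-squares trick.
    n, d = X.shape
    Xw = np.sqrt(v)[:, None] * X          # samples scaled by their self-paced weights
    D = np.eye(d)                         # re-weighting matrix for the L2,1 term
    W = np.zeros((d, d))
    for _ in range(n_iter):
        W = np.linalg.solve(Xw.T @ Xw + alpha * D, Xw.T @ Xw)
        row_norms = np.maximum(np.linalg.norm(W, axis=1), 1e-8)
        D = np.diag(1.0 / (2.0 * row_norms))
    return W

def self_paced_feature_selection(X, alpha=1.0, start_frac=0.5, n_rounds=6):
    # Alternate between fitting W on the currently admitted samples and
    # re-admitting samples whose reconstruction loss falls below a pace
    # threshold that grows until all samples are potentially included.
    n, _ = X.shape
    v = np.ones(n)                        # first fit uses all samples to get losses
    for t in range(n_rounds):
        W = l21_weighted_ridge(X, v, alpha=alpha)
        loss = np.sum((X - X @ W) ** 2, axis=1)      # per-sample reconstruction loss
        frac = min(1.0, start_frac + t * (1.0 - start_frac) / max(n_rounds - 1, 1))
        pace = np.quantile(loss, frac)    # growing pace threshold
        v = (loss <= pace).astype(float)  # hard self-paced weights: outliers get 0
    scores = np.linalg.norm(W, axis=1)    # rank features by the row norms of W
    return np.argsort(-scores), W

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    X[:5] += 10.0                         # inject a few obvious outliers
    ranking, W = self_paced_feature_selection(X)
    print("top-ranked features:", ranking[:5])

The quantile-based pace schedule is one simple way to realize the "gradually involve more samples" behaviour; soft or mixture weighting schemes are common alternatives in the self-paced learning literature.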




Updated: 2020-03-20