Supervised Feature Selection With Orthogonal Regression and Feature Weighting.
IEEE Transactions on Neural Networks and Learning Systems (IF 10.4), Pub Date: 2020-05-14, DOI: 10.1109/tnnls.2020.2991336
Xia Wu , Xueyuan Xu , Jianhong Liu , Hailing Wang , Bin Hu , Feiping Nie

Effective features can improve the performance of a model and help us understand the characteristics and underlying structure of complex data. Previously proposed feature selection methods often fail to retain sufficient discriminative information. To address this shortcoming, we propose a novel supervised orthogonal least squares regression model with feature weighting for feature selection. The optimization problem of the objective function can be solved by employing the generalized power iteration and augmented Lagrangian multiplier methods. Experimental results show that the proposed method can reduce feature dimensionality more effectively and obtain better classification results than traditional feature selection methods. The convergence of our iterative method is also proved. Consequently, the effectiveness and superiority of the proposed method are verified both theoretically and experimentally.
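The abstract mentions that the orthogonality-constrained regression subproblem is handled with generalized power iteration (GPI). The sketch below illustrates that subroutine only, for the plain orthogonal regression problem min ||X^T W - Y||_F^2 subject to W^T W = I; it is a minimal illustration under assumed matrix shapes, not the authors' full feature-weighting algorithm, and the function name and parameters are hypothetical.

```python
import numpy as np

def gpi_orthogonal_regression(X, Y, n_iter=100, tol=1e-8, seed=0):
    """Minimize ||X.T @ W - Y||_F^2 s.t. W.T @ W = I via generalized
    power iteration (GPI).

    X : (d, n) data matrix (features x samples), assumed layout
    Y : (n, c) target/label matrix
    Returns W : (d, c) with orthonormal columns.
    """
    d, _ = X.shape
    c = Y.shape[1]

    A = X @ X.T                      # (d, d) quadratic term
    B = X @ Y                        # (d, c) linear term
    # Shift so that (alpha*I - A) is positive semi-definite; the shift
    # leaves the constrained minimizer unchanged because tr(W.T @ W) = c
    # is constant on the orthogonality constraint set.
    alpha = np.linalg.eigvalsh(A)[-1] + 1e-6
    A_tilde = alpha * np.eye(d) - A

    rng = np.random.default_rng(seed)
    W, _ = np.linalg.qr(rng.standard_normal((d, c)))   # orthonormal start

    prev_obj = np.inf
    for _ in range(n_iter):
        M = 2.0 * A_tilde @ W + 2.0 * B
        U, _, Vt = np.linalg.svd(M, full_matrices=False)
        W = U @ Vt                   # closest orthonormal matrix to M
        obj = np.linalg.norm(X.T @ W - Y, "fro") ** 2
        if abs(prev_obj - obj) < tol:
            break
        prev_obj = obj
    return W
```

Each iteration updates W with the polar factor of the gradient-like matrix M, which monotonically improves the objective; the paper's method would additionally learn a diagonal feature-weighting matrix and enforce its constraints with an augmented Lagrangian scheme.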

Updated: 2020-05-14