Feature selection with kernelized multi-class support vector machine
Pattern Recognition (IF 7.5), Pub Date: 2021-04-20, DOI: 10.1016/j.patcog.2021.107988
Yinan Guo , Zirui Zhang , Fengzhen Tang

Feature selection is an important procedure in machine learning because it can reduce the complexity of the final learning model and simplify its interpretation. In this paper, we propose a novel non-linear feature selection method that targets multi-class classification problems in the framework of support vector machines. The proposed method combines a kernelized multi-class support vector machine with a fast version of recursive feature elimination. It selects features that work well for all classes, as the involved classifier simultaneously constructs multiple decision functions, each separating one class from the others. We formulate the classifier as a large optimisation problem and iteratively solve for one decision function at a time, yielding a lower computational time complexity than solving the large optimisation problem directly. The coefficients of the classifier are then used as a ranking criterion in the accelerated recursive feature elimination, which adds batch elimination and a rechecking process. Experimental results on several datasets demonstrate the superior performance of the proposed feature selection method.
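The recursive-elimination loop described above can be sketched in a few lines. This is a minimal illustration only: the paper ranks features by the coefficients of its kernelized multi-class SVM, whereas the stand-in `rank_features` below uses a simple class-mean-difference score, and the names `rank_features`, `rfe`, and the `batch` parameter are hypothetical, not from the paper.

```python
def rank_features(X, y, feats):
    # Stand-in ranking criterion: absolute difference of per-class feature
    # means. The paper instead derives scores from the multi-class SVM's
    # coefficients; this toy score merely keeps the sketch self-contained.
    scores = {}
    for f in feats:
        m0 = sum(x[f] for x, t in zip(X, y) if t == 0) / max(1, y.count(0))
        m1 = sum(x[f] for x, t in zip(X, y) if t == 1) / max(1, y.count(1))
        scores[f] = abs(m0 - m1)
    return scores

def rfe(X, y, n_keep, batch=2):
    # Accelerated recursive feature elimination: instead of dropping one
    # feature per round, drop a batch of the lowest-ranked features.
    # (The paper's rechecking step, which can restore features, is omitted.)
    feats = list(range(len(X[0])))
    while len(feats) > n_keep:
        scores = rank_features(X, y, feats)
        n_drop = min(batch, len(feats) - n_keep)
        drop = set(sorted(feats, key=lambda f: scores[f])[:n_drop])
        feats = [f for f in feats if f not in drop]
    return feats
```

On a toy two-class dataset where only the first feature is informative, the loop retains that feature while eliminating the uninformative ones in batches.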




Updated: 2021-05-07