Least squares approach to K-SVCR multi-class classification with its applications
Annals of Mathematics and Artificial Intelligence (IF 1.2), Pub Date: 2021-06-21, DOI: 10.1007/s10472-021-09747-1
Hossein Moosaei , Milan Hladík

The support vector classification-regression machine for K-class classification (K-SVCR) is a novel multi-class classification method based on the "1-versus-1-versus-rest" structure. In this paper, we propose a least squares version of K-SVCR, named LSK-SVCR. As in the K-SVCR algorithm, this method evaluates all the training data within a "1-versus-1-versus-rest" structure, so that the algorithm generates ternary outputs {−1, 0, +1}. In LSK-SVCR, the solution of the primal problem is computed by solving only one system of linear equations, instead of solving the dual problem, which in K-SVCR is a convex quadratic programming problem. Experimental results on several benchmark data sets, MC-NDC, and handwritten digit recognition data sets show that LSK-SVCR not only achieves better classification accuracy than the K-SVCR and Twin-KSVC algorithms but also has a remarkably higher learning speed.
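The key computational claim above is that a least-squares formulation replaces the quadratic program with a single linear solve. As a minimal sketch only, the following illustrates the general idea with a simplified regularized least-squares subproblem (the function names, the objective `min ||Xw + b − y||² + ||w||²/C`, and the dead-zone threshold `eps` are illustrative assumptions, not the paper's exact formulation, which includes additional loss terms per class group):

```python
import numpy as np

def ls_ternary_subproblem(X, y, C=1.0):
    """Hypothetical '1-vs-1-vs-rest' subproblem solved in closed form.

    X : (n, d) training matrix; y : ternary targets in {-1, 0, +1}.
    Minimizes ||X w + b - y||^2 + ||w||^2 / C, which reduces to one
    linear system -- no quadratic programming solver is needed.
    """
    n, d = X.shape
    A = np.hstack([X, np.ones((n, 1))])            # append bias column
    G = A.T @ A + np.diag([1.0 / C] * d + [0.0])   # regularize w, not b
    v = np.linalg.solve(G, A.T @ y)                # single linear solve
    return v[:d], v[d]                             # (w, b)

def ternary_predict(X, w, b, eps=0.5):
    """Map real-valued scores to {-1, 0, +1} with a dead zone of width 2*eps,
    mirroring the ternary output structure described in the abstract."""
    s = X @ w + b
    return np.where(s > eps, 1, np.where(s < -eps, -1, 0))
```

In a full K-class classifier, one such subproblem would be solved for each pair of classes, with the remaining classes assigned the 0 label; the closed-form solve is what accounts for the reported speedup over dual-QP methods.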



Updated: 2021-06-22