Faster SVM Training via Conjugate SMO
Pattern Recognition (IF 8), Pub Date: 2021-03-01, DOI: 10.1016/j.patcog.2020.107644
Alberto Torres-Barrán , Carlos M. Alaíz , José R. Dorronsoro

We propose an improved version of the SMO algorithm for training classification and regression SVMs, based on a Conjugate Descent procedure. This new approach involves only a modest increase in the computational cost of each iteration but, in turn, usually results in a substantial decrease in the number of iterations required to converge to a given precision. Moreover, we prove convergence of the iterates of this new Conjugate SMO, as well as a linear rate when the kernel matrix is positive definite. We have implemented Conjugate SMO within the LIBSVM library and show experimentally that it is faster for many hyper-parameter configurations, often making it a better option than second-order SMO when performing a grid search for SVM tuning.
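To illustrate the ingredient the abstract refers to, the sketch below applies plain conjugate descent to the unconstrained SVM dual quadratic. This is not the authors' Conjugate SMO, which works inside the two-variable SMO working-set framework and respects the box and equality constraints of the dual; the toy data, RBF kernel, gamma value, and diagonal jitter are all assumptions made only for the example.

```python
import numpy as np

# Toy illustration (not the authors' Conjugate SMO): conjugate descent on the
# unconstrained SVM dual quadratic
#     min_alpha  0.5 * alpha^T Q alpha - e^T alpha,  Q_ij = y_i y_j K(x_i, x_j),
# ignoring the constraints 0 <= alpha_i <= C and y^T alpha = 0 that SMO handles.
rng = np.random.default_rng(0)

# Synthetic two-class data and an RBF kernel matrix (assumed parameters).
n, d, gamma = 80, 5, 0.5
X = rng.normal(size=(n, d))
y = np.sign(X[:, 0] + 0.1 * rng.normal(size=n))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-gamma * sq)
Q = (y[:, None] * y[None, :]) * K + 1e-8 * np.eye(n)   # jitter keeps Q positive definite
e = np.ones(n)

alpha = np.zeros(n)
g = Q @ alpha - e            # gradient of the dual objective
d_dir = -g                   # first direction: plain steepest descent
for it in range(200):
    Qd = Q @ d_dir
    step = -(g @ d_dir) / (d_dir @ Qd)   # exact line search along d_dir on the quadratic
    alpha += step * d_dir
    g = Q @ alpha - e
    if np.linalg.norm(g) < 1e-6:
        break
    beta = (g @ Qd) / (d_dir @ Qd)       # correction that makes the new direction Q-conjugate
    d_dir = -g + beta * d_dir

print(f"iterations: {it + 1}, ||grad||: {np.linalg.norm(g):.2e}")
```

The positive-definite jitter in the sketch mirrors the assumption under which the paper proves its linear convergence rate; in Conjugate SMO the conjugate correction is applied to the cheap two-coordinate SMO directions rather than to the full gradient.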
