An efficient augmented Lagrangian method for support vector machine
Optimization Methods & Software (IF 1.4), Pub Date: 2020-03-04, DOI: 10.1080/10556788.2020.1734002
Yinqiao Yan, Qingna Li

Support vector machine (SVM) has proved to be a successful approach for machine learning. Two typical SVM models are the L1-loss model for support vector classification (SVC) and the ε-L1-loss model for support vector regression (SVR). Due to the non-smoothness of the L1-loss function in the two models, most traditional approaches focus on solving the dual problem. In this paper, we propose an augmented Lagrangian method for the L1-loss model that solves the primal problem directly. By tackling the non-smooth term in the model with Moreau–Yosida regularization and the proximal operator, the subproblem in the augmented Lagrangian method reduces to a non-smooth linear system, which can be solved via the quadratically convergent semismooth Newton method. Moreover, the high computational cost of the semismooth Newton method can be significantly reduced by exploiting the sparse structure of the generalized Jacobian. Numerical results on various datasets in LIBLINEAR show that the proposed method is competitive with the most popular solvers in both speed and accuracy.
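To make the pipeline concrete: for the standard L1-loss SVC primal, min_{w,b} 0.5*||w||^2 + C * sum_i max(0, 1 - y_i*(w'x_i + b)), the Moreau–Yosida step reduces to the proximal operator of the componentwise hinge term t*max(u, 0). The following minimal Python sketch (an illustration under that standard formulation, not the authors' implementation; the function names are hypothetical) gives the closed-form prox and the 0/1 diagonal generalized Jacobian whose sparsity the semismooth Newton method exploits:

    import numpy as np

    def prox_hinge(v, t):
        """Proximal operator of u -> t*max(u, 0), applied componentwise:
        argmin_u t*max(u, 0) + 0.5*(u - v)^2.
        Closed form: v - t where v > t; 0 where 0 <= v <= t; v where v < 0."""
        return np.where(v > t, v - t, np.where(v < 0.0, v, 0.0))

    def prox_jacobian_diag(v, t):
        """Diagonal of one element of the (Clarke) generalized Jacobian of
        prox_hinge: 1 where the prox is locally the identity (v < 0 or v > t),
        0 on the flat piece (0 <= v <= t); at the two kinks either value is
        a valid choice."""
        return ((v < 0.0) | (v > t)).astype(float)

Since the diagonal vanishes on the flat piece, each semismooth Newton system involves only the data rows with nonzero diagonal entries, which is, roughly, the sparse structure the abstract refers to.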




Updated: 2020-03-04