A majorization-minimization scheme for L2 support vector regression
Journal of Statistical Computation and Simulation (IF 1.2), Pub Date: 2021-04-28, DOI: 10.1080/00949655.2021.1918691
Songfeng Zheng

In a support vector regression (SVR) model, using the squared ϵ-insensitive loss function makes the optimization problem strictly convex and yields a more concise solution. However, the formulation of L2-SVR leads to a quadratic programming problem, which is expensive to solve. This paper reformulates the optimization problem of L2-SVR by absorbing the constraints into the objective function; the resulting problem can be solved efficiently by a majorization-minimization approach, in which each iteration derives an upper bound for the objective function that is easier to minimize. The proposed approach is easy to implement and requires no computing package beyond basic linear algebra operations. Numerical studies on real-world datasets show that, compared to the alternatives, the proposed approach achieves similar prediction accuracy with substantially higher training-time efficiency.
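The abstract does not reproduce the paper's derivation, so the following is a minimal sketch of the MM idea for the linear case, under stated assumptions: the objective is taken as (1/2)||w||^2 + C Σ_i max(0, |y_i − w·x_i − b| − ϵ)^2, and the squared ϵ-insensitive loss is majorized by the descent-lemma quadratic (the loss is the squared distance of the residual to the tube [−ϵ, ϵ], so its gradient is 2-Lipschitz). Under that majorizer, each iteration collapses to a closed-form ridge-style linear solve, consistent with the claim that only basic linear algebra is required. The function name, defaults, and choice of surrogate here are hypothetical; the paper's actual scheme may differ.

```python
import numpy as np

def l2_svr_mm(X, y, C=1.0, eps=0.1, n_iter=200, tol=1e-8):
    """Illustrative MM solver for linear L2-SVR (hypothetical sketch).

    Objective: 0.5*||w||^2 + C * sum_i max(0, |y_i - w.x_i - b| - eps)^2.
    The squared eps-insensitive loss is the squared distance of the residual
    to [-eps, eps]; its gradient is 2-Lipschitz, so the descent-lemma
    quadratic majorizes it and each MM step is a ridge-style linear solve.
    """
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])   # absorb the intercept b
    R = np.eye(d + 1)
    R[-1, -1] = 0.0                        # do not penalize the intercept
    A = R + 2.0 * C * Xb.T @ Xb            # normal matrix, fixed across iterations
    theta = np.zeros(d + 1)
    obj_old = np.inf
    for _ in range(n_iter):
        r = y - Xb @ theta                 # current residuals
        p = np.clip(r, -eps, eps)          # projection of residuals onto the tube
        z = y - p                          # shifted targets from the majorizer
        # Minimize the surrogate 0.5*||w||^2 + C*||z - Xb@theta||^2 in closed form.
        theta = np.linalg.solve(A, 2.0 * C * Xb.T @ z)
        loss = np.maximum(0.0, np.abs(y - Xb @ theta) - eps) ** 2
        obj = 0.5 * theta[:-1] @ theta[:-1] + C * loss.sum()
        if obj_old - obj < tol:            # MM guarantees monotone descent
            break
        obj_old = obj
    return theta[:-1], theta[-1]           # weights w and intercept b

if __name__ == "__main__":
    # Toy usage on synthetic data (illustrative only).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = X @ rng.normal(size=5) + 0.3 + 0.05 * rng.normal(size=200)
    w, b = l2_svr_mm(X, y, C=10.0, eps=0.1)
```

Because the surrogate equals the true objective at the current iterate and upper-bounds it everywhere else, each solve can only decrease the L2-SVR objective; this monotone descent is what an MM scheme trades for avoiding a quadratic-programming solver.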



