Efficient implicit Lagrangian twin parametric insensitive support vector regression via unconstrained minimization problems
Annals of Mathematics and Artificial Intelligence (IF 1.2), Pub Date: 2020-11-19, DOI: 10.1007/s10472-020-09708-0
Deepak Gupta, Bharat Richhariya

In this paper, an efficient implicit Lagrangian twin parametric insensitive support vector regression is proposed that leads to a pair of unconstrained minimization problems, motivated by the work on twin parametric insensitive support vector regression (Peng: Neurocomputing. 79, 26–38, 2012) and Lagrangian twin support vector regression (Balasundaram and Tanveer: Neural Comput. Applic. 22(1), 257–267, 2013). Since its objective function is strongly convex, piecewise quadratic, and differentiable, it can be solved by gradient-based iterative methods. Notice that the objective function contains the non-smooth 'plus' function, so one can consider either the generalized Hessian or a smooth approximation function to replace the 'plus' function, and then apply a simple Newton-Armijo step-size algorithm. These algorithms can be easily implemented in MATLAB and do not require any optimization toolbox. The advantage of this method is that the proposed algorithms take less training time and can deal with data having a heteroscedastic noise structure. To demonstrate the effectiveness of the proposed method, computational results are obtained on synthetic and real-world datasets, which clearly show comparable generalization performance and improved learning speed in comparison with support vector regression, twin support vector regression, and twin parametric insensitive support vector regression.
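
To illustrate the smoothing-plus-Newton-Armijo step described above, the following minimal Python/NumPy sketch (not the authors' MATLAB code; A, b, C, and alpha are placeholder problem data, and the function names are illustrative) applies the standard smooth approximation of the plus function, p(x, a) = x + (1/a) log(1 + exp(-a x)), to a generic strongly convex, piecewise-quadratic objective of the kind the abstract refers to:

import numpy as np

def smooth_plus(x, alpha=5.0):
    # Smooth approximation of the 'plus' function (x)_+ = max(x, 0):
    # p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha * x)),
    # which tends to (x)_+ as alpha -> infinity.
    return x + np.logaddexp(0.0, -alpha * x) / alpha

def smooth_plus_deriv(x, alpha=5.0):
    # Derivative of smooth_plus with respect to x: the logistic sigmoid of alpha * x.
    return 1.0 / (1.0 + np.exp(-alpha * x))

def newton_armijo(A, b, C=10.0, alpha=5.0, tol=1e-6, max_iter=50):
    # Minimize f(u) = 0.5 * ||u||^2 + (C/2) * ||p(A u - b, alpha)||^2
    # by Newton's method with an Armijo backtracking line search.
    m, n = A.shape
    u = np.zeros(n)
    for _ in range(max_iter):
        r = A @ u - b
        p = smooth_plus(r, alpha)
        s = smooth_plus_deriv(r, alpha)
        grad = u + C * (A.T @ (p * s))
        if np.linalg.norm(grad) < tol:
            break
        # Hessian of the smoothed objective: I + C * A^T diag(d) A,
        # with d = s^2 + alpha * p * s * (1 - s) >= 0, so H is positive definite.
        d = s ** 2 + alpha * p * s * (1.0 - s)
        H = np.eye(n) + C * (A.T @ (d[:, None] * A))
        step = np.linalg.solve(H, -grad)
        # Armijo rule: halve the step size t until sufficient decrease holds.
        f_old = 0.5 * (u @ u) + 0.5 * C * (p @ p)
        t = 1.0
        while t > 1e-8:
            u_new = u + t * step
            p_new = smooth_plus(A @ u_new - b, alpha)
            f_new = 0.5 * (u_new @ u_new) + 0.5 * C * (p_new @ p_new)
            if f_new <= f_old + 1e-4 * t * (grad @ step):
                break
            t *= 0.5
        u = u + t * step
    return u

Because the smoothed objective is strongly convex, the Hessian in this sketch is positive definite, so each Newton system has a unique solution, and the Armijo backtracking guarantees monotone descent; this is why no external optimization toolbox is needed for this class of problems.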
