Linear support vector regression with linear constraints
Machine Learning (IF 7.5), Pub Date: 2021-06-24, DOI: 10.1007/s10994-021-06018-2
Quentin Klopfenstein , Samuel Vaiter

This paper studies the addition of linear constraints to Support Vector Regression (SVR) when the kernel is linear. Adding such constraints allows prior knowledge to be incorporated into the estimator, such as requiring a non-negative weight vector, a probability vector, or a monotone fit. We prove that the resulting optimization problem remains a convex quadratic problem with a positive semi-definite Hessian. We also propose a generalization of the Sequential Minimal Optimization (SMO) algorithm that handles linear constraints, and we prove its convergence. We show that this efficient iterative algorithm, with closed-form updates, yields the solution of the underlying optimization problem. The practical performance of the estimator is then demonstrated on simulated and real datasets in several settings: non-negative regression, regression onto the simplex for biomedical data, and isotonic regression for weather forecasting. These experiments show the usefulness of this estimator in comparison with more classical approaches.
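To make the setting concrete, the non-negative-regression case from the abstract can be written as the standard epsilon-insensitive linear SVR primal with an extra constraint w >= 0 on the weight vector, which is still a convex quadratic program. The sketch below is a hypothetical illustration using a general-purpose solver (scipy's SLSQP), not the paper's constrained-SMO algorithm; the function name and all parameter defaults are assumptions for the example.

```python
import numpy as np
from scipy.optimize import minimize

def constrained_linear_svr(X, y, C=10.0, eps=0.1):
    """Epsilon-insensitive linear SVR with a non-negativity constraint
    on the weights, written as a smooth QP in (w, b, xi, xi_star).
    Illustrative sketch only -- not the paper's SMO-based solver."""
    n, d = X.shape
    # Variable layout: [w (d), b (1), xi (n), xi_star (n)].
    def unpack(z):
        return z[:d], z[d], z[d + 1:d + 1 + n], z[d + 1 + n:]

    def obj(z):
        w, b, xi, xis = unpack(z)
        # 1/2 ||w||^2 + C * sum of slacks (epsilon-insensitive loss).
        return 0.5 * w @ w + C * (xi.sum() + xis.sum())

    def grad(z):
        w, b, xi, xis = unpack(z)
        g = np.zeros_like(z)
        g[:d] = w           # d/dw of 1/2 ||w||^2
        g[d + 1:] = C       # d/dxi of C * sum(slacks); b has zero gradient
        return g

    cons = [
        # y_i - w.x_i - b <= eps + xi_i, rewritten as ">= 0".
        {'type': 'ineq',
         'fun': lambda z: eps + unpack(z)[2] - (y - X @ unpack(z)[0] - unpack(z)[1])},
        # w.x_i + b - y_i <= eps + xi_star_i, rewritten as ">= 0".
        {'type': 'ineq',
         'fun': lambda z: eps + unpack(z)[3] - (X @ unpack(z)[0] + unpack(z)[1] - y)},
    ]
    # Bounds encode the prior knowledge w >= 0 plus slack non-negativity.
    bounds = [(0, None)] * d + [(None, None)] + [(0, None)] * (2 * n)
    z0 = np.zeros(d + 1 + 2 * n)
    res = minimize(obj, z0, jac=grad, bounds=bounds, constraints=cons,
                   method='SLSQP', options={'maxiter': 500})
    w, b, _, _ = unpack(res.x)
    return w, b

# Small synthetic check: data generated with positive true weights.
rng = np.random.default_rng(0)
X = rng.standard_normal((30, 3))
y = X @ np.array([1.0, 0.5, 2.0]) + 0.3
w, b = constrained_linear_svr(X, y)
```

Other constraint sets from the abstract (simplex, monotonicity) fit the same template by swapping the bounds for the corresponding linear equalities/inequalities on w; the paper's contribution is a dedicated SMO-style solver for this class of problems rather than a generic QP call.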



Updated: 2021-06-25