On Regularization Based Twin Support Vector Regression with Huber Loss
Neural Processing Letters ( IF 3.1 ) Pub Date : 2021-01-03 , DOI: 10.1007/s11063-020-10380-y
Umesh Gupta 1 , Deepak Gupta 1
Twin support vector regression (TSVR) is generally employed with the \( \varepsilon \)-insensitive loss function, which is not well suited to handling noise and outliers. By definition, the Huber loss function is quadratic for small errors and linear for larger ones; it performs better than the Gaussian loss and therefore readily suppresses different types of noise and outliers. Recently, TSVR with Huber loss (HN-TSVR) has been suggested to handle noise and outliers. Like TSVR, however, it suffers from a singularity problem that degrades the performance of the model. In this paper, a regularized version of HN-TSVR, termed regularization-based twin support vector regression (RHN-TSVR), is proposed to avoid the singularity problem of HN-TSVR by applying the structural risk minimization principle, which makes the model convex and well posed. The proposed RHN-TSVR model handles noise and outliers well and avoids the singularity issue. To show the validity and applicability of the proposed RHN-TSVR, experiments are performed on several artificially generated datasets with uniform, Gaussian, and Laplacian noise, as well as on different benchmark real-world datasets, and the results are compared with support vector regression, TSVR, \( \varepsilon \)-asymmetric Huber SVR, \( \varepsilon \)-support vector quantile regression, and HN-TSVR. All benchmark real-world datasets are embedded with noise at significance levels of 0%, 5%, and 10% for the reported algorithms and the proposed approach. The proposed RHN-TSVR algorithm shows better prediction ability than the other reported models on both artificial and real-world datasets across the different noise levels.
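The piecewise behavior of the Huber loss described above — quadratic for small residuals, linear for large ones — can be sketched as follows. This is a minimal illustration of the standard Huber loss, not the paper's HN-TSVR formulation; the threshold parameter `delta` is a hypothetical name chosen here for illustration.

```python
def huber_loss(residual: float, delta: float = 1.0) -> float:
    """Standard Huber loss: quadratic for |residual| <= delta, linear beyond.

    The quadratic and linear pieces are joined so the loss is continuous
    and differentiable at |residual| = delta, which is what makes it less
    sensitive to outliers than a purely quadratic (Gaussian) loss.
    """
    a = abs(residual)
    if a <= delta:
        return 0.5 * a * a          # quadratic region: behaves like squared loss
    return delta * (a - 0.5 * delta)  # linear region: grows slowly for outliers
```

For example, a small residual of 0.5 incurs a quadratic penalty of 0.125, while a large residual of 4.0 incurs only 3.5 (linear) rather than 8.0 (quadratic), which limits the influence of outliers on the fitted regressor.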



