A class of new Support Vector Regression models
Applied Soft Computing (IF 7.2), Pub Date: 2020-06-11, DOI: 10.1016/j.asoc.2020.106446
Pritam Anand , Reshma Rastogi , Suresh Chandra

We propose a novel convex loss function, termed the ‘ϵ-penalty loss function’, for use in the Support Vector Regression (SVR) model. The proposed ϵ-penalty loss function is shown to be optimal for a more general noise distribution; the popular ϵ-insensitive loss function and the Laplace loss function are particular cases of it. Using the proposed loss function, we develop two new Support Vector Regression models in this paper. The first model, termed the ‘ϵ-Penalty Support Vector Regression’ (ϵ-PSVR) model, minimizes the proposed loss function with L2-norm regularization. The second model minimizes the proposed loss function with L1-norm regularization and is termed the ‘L1-Norm Penalty Support Vector Regression’ (L1-Norm PSVR) model. The proposed loss function can apply different rates of penalization inside and outside of the ϵ-tube. This strategy enables the proposed SVR models to use the full information of the training set, which helps them generalize well. Further, numerical results from experiments on various artificial, benchmark, and financial time series datasets show that the proposed SVR models have better generalization ability than existing SVR models.
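To illustrate the idea of a convex loss with different penalization rates inside and outside the ϵ-tube, here is a minimal sketch. The parametrization below (a slope `k` inside the tube, slope 1 outside, joined continuously at |u| = ϵ) is an assumption for illustration and may differ from the exact form in the paper; it does, however, recover the two stated special cases, since `k = 0` gives the ϵ-insensitive loss and `k = 1` gives the Laplace loss.

```python
import numpy as np

def eps_penalty_loss(u, eps=0.5, k=0.3):
    """Hypothetical sketch of an epsilon-penalty-style loss.

    Penalizes residuals u at rate k (0 <= k <= 1) inside the eps-tube
    and at rate 1 outside, joined continuously at |u| = eps, so the
    function stays convex. The exact parametrization in the paper may
    differ; this is only an illustration of the inside/outside idea.
    """
    a = np.abs(u)
    # Inside the tube: k * |u|; outside: |u| - (1 - k) * eps,
    # which equals k * eps at the boundary |u| = eps (continuity).
    return np.where(a <= eps, k * a, a - (1.0 - k) * eps)

# Special cases of the stated kind:
u = np.linspace(-2.0, 2.0, 9)
eps_insensitive = np.maximum(np.abs(u) - 0.5, 0.0)   # k = 0
laplace = np.abs(u)                                  # k = 1
```

With `k = 0` the loss vanishes inside the tube and grows linearly outside (ϵ-insensitive); with `k = 1` the tube has no effect and the loss is |u| everywhere (Laplace). Intermediate `k` uses residuals inside the tube as well, which is the "full information of the training set" property the abstract describes.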




Updated: 2020-06-11