Kernel-based regression via a novel robust loss function and iteratively reweighted least squares
Knowledge and Information Systems ( IF 2.7 ) Pub Date : 2021-03-20 , DOI: 10.1007/s10115-021-01554-8
Hongwei Dong , Liming Yang

Least squares kernel-based methods have been widely used in regression problems due to their simple implementation and good generalization performance. Among them, least squares support vector regression (LS-SVR) and the extreme learning machine (ELM) are popular techniques. However, noise sensitivity is a major bottleneck. To address this issue, a generalized loss function, called the \(\ell _s\)-loss, is proposed in this paper. With the support of the novel loss function, two kernel-based regressors are constructed by replacing the \(\ell _2\)-loss in LS-SVR and ELM with the proposed \(\ell _s\)-loss for better noise robustness. Important properties of the \(\ell _s\)-loss, including robustness, asymmetry and asymptotic approximation behavior, are verified theoretically. Moreover, iteratively reweighted least squares (IRLS) is utilized to optimize the proposed methods and to interpret them from a weighted viewpoint. The convergence of the proposed methods is proved, and detailed analyses of their robustness are given. Experiments on both artificial and benchmark datasets confirm the validity of the proposed methods.
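The abstract's core idea, solving a robust kernel regression by iteratively reweighted least squares, can be sketched in a few lines. The paper's \(\ell _s\)-loss is not defined in this excerpt, so the sketch below substitutes the classical Welsch loss \(\rho(r) = 1 - \exp(-r^2/2c^2)\) as a stand-in robust loss; the function names and parameters are illustrative, not from the paper.

```python
# Illustrative sketch only: the paper's l_s-loss is not given in this
# abstract, so the Welsch loss is used as a stand-in robust loss.
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def irls_kernel_regression(X, y, gamma=1.0, lam=1e-2, c=1.0, n_iter=20):
    """Robust kernel ridge regression via IRLS.

    Each iteration downweights large residuals with the Welsch weight
    w_i = exp(-r_i^2 / (2 c^2)), then solves the weighted regularized
    normal equations (K W K + lam K) alpha = K W y for the dual
    coefficients alpha.
    """
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    # Start from the ordinary (unweighted) least-squares solution.
    alpha = np.linalg.solve(K + lam * np.eye(n), y)
    for _ in range(n_iter):
        r = K @ alpha - y
        w = np.exp(-r ** 2 / (2 * c ** 2))   # robust weights in (0, 1]
        W = np.diag(w)
        # Small jitter keeps the system numerically well-posed.
        A = K @ W @ K + lam * K + 1e-8 * np.eye(n)
        alpha = np.linalg.solve(A, K @ W @ y)
    return alpha

# Usage: a noisy sine with one gross outlier. The robust weights drive
# the outlier's influence toward zero, unlike a pure l2-loss fit.
rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 60)[:, None]
y = np.sin(X).ravel() + 0.05 * rng.standard_normal(60)
y[30] += 5.0                                  # inject an outlier
alpha = irls_kernel_regression(X, y)
pred = rbf_kernel(X, X) @ alpha
```

The weighted system comes from setting the gradient of \(\sum_i w_i (K\alpha - y)_i^2 + \lambda\, \alpha^\top K \alpha\) to zero with the weights frozen, which is the standard IRLS majorization step for a bounded robust loss.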




Updated: 2021-03-21