Learning with correntropy-induced losses for regression with mixture of symmetric stable noise
Applied and Computational Harmonic Analysis (IF 2.6) Pub Date: 2019-09-04, DOI: 10.1016/j.acha.2019.09.001
Yunlong Feng, Yiming Ying

In recent years, correntropy and its applications in machine learning have been drawing continuous attention owing to their merits in dealing with non-Gaussian noise and outliers. However, theoretical understanding of correntropy, especially in the learning theory context, is still limited. In this study, we investigate correntropy-based regression in the presence of non-Gaussian noise or outliers within the statistical learning framework. Motivated by the practical way of generating non-Gaussian noise or outliers, we introduce the mixture of symmetric stable noise, which includes Gaussian noise, Cauchy noise, and their mixtures as special cases, to model non-Gaussian noise or outliers. We demonstrate that under the mixture of symmetric stable noise assumption, correntropy-based regression can learn the conditional mean function or the conditional median function well without resorting to the finite-variance or even the finite first-order moment condition on the noise. In particular, for the above two cases, we establish learning rates for correntropy-based regression estimators that are asymptotically of type O(n^{-1}). These results justify the effectiveness of the correntropy-based regression estimators in dealing with outliers as well as non-Gaussian noise. We believe that the present study makes a step forward towards understanding correntropy-based regression from a statistical learning viewpoint, and may also shed some light on robust statistical learning for regression.
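To make the setting concrete, the following is a minimal Python sketch (not taken from the paper) of regression with a correntropy-induced loss of the Welsch type, l_sigma(r) = sigma^2 (1 - exp(-r^2 / sigma^2)), fitted on data whose noise is a mixture of symmetric stable distributions (Gaussian plus Cauchy). The scale parameter sigma, the mixture weights, and the optimizer are illustrative assumptions, not the estimator or the rates analyzed in the paper.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic data: a linear target corrupted by a mixture of symmetric stable noise
# (Gaussian, i.e. alpha = 2, with probability 0.8; Cauchy, i.e. alpha = 1, with probability 0.2).
n = 500
X = rng.uniform(-2.0, 2.0, size=n)
w_true, b_true = 1.5, -0.5
is_cauchy = rng.random(n) < 0.2
noise = np.where(is_cauchy,
                 rng.standard_cauchy(n),        # heavy-tailed: no finite first moment
                 rng.normal(0.0, 0.3, n))
y = w_true * X + b_true + noise

def correntropy_loss(params, sigma=1.0):
    """Empirical correntropy-induced (Welsch-type) loss,
    mean of sigma^2 * (1 - exp(-r^2 / sigma^2)) over the residuals r."""
    w, b = params
    r = y - (w * X + b)
    return np.mean(sigma**2 * (1.0 - np.exp(-r**2 / sigma**2)))

def least_squares_loss(params):
    """Ordinary least-squares loss, for comparison under heavy-tailed noise."""
    w, b = params
    r = y - (w * X + b)
    return np.mean(r**2)

cir_fit = minimize(correntropy_loss, x0=[0.0, 0.0], method="Nelder-Mead").x
ls_fit = minimize(least_squares_loss, x0=[0.0, 0.0], method="Nelder-Mead").x

print(f"true coefficients:     w = {w_true:.3f}, b = {b_true:.3f}")
print(f"correntropy-based fit: w = {cir_fit[0]:.3f}, b = {cir_fit[1]:.3f}")
print(f"least-squares fit:     w = {ls_fit[0]:.3f}, b = {ls_fit[1]:.3f}")
```

Because the correntropy-induced loss is bounded, residuals produced by the Cauchy component contribute at most sigma^2 each, so the fit is far less sensitive to the heavy-tailed outliers than the unbounded squared loss; this bounded-influence behavior is the intuition behind the robustness results described above.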



Updated: 2019-09-04