Unregularized Online Algorithms with Varying Gaussians
Constructive Approximation (IF 2.7), Pub Date: 2021-04-07, DOI: 10.1007/s00365-021-09536-3
Baobin Wang , Ting Hu

Gaussians are a family of Mercer kernels widely used in machine learning and statistics. The variance of a Gaussian kernel reflects the specific structure of the reproducing kernel Hilbert space (RKHS) it induces, as well as other important features of the learning problem, such as the frequencies of the function components. As the variance of the Gaussian decreases, the learning performance and approximation ability improve. This paper introduces an unregularized online algorithm with decreasing Gaussian variances, in which no regularization term is imposed and the samples are presented sequentially. With appropriate step sizes, concrete learning rates are derived under smoothness assumptions on the target function, which are used to bound the approximation error. Additionally, a new type of geometric noise condition is proposed to estimate the approximation error in place of any smoothness assumption. It is more general than the condition in Steinwart et al. (Ann Stat 35(2):575–607, 2007), which is suitable only for the hinge loss. An essential estimate bounds the difference between the approximation functions generated by the varying Gaussian RKHSs. The Fourier transform plays a crucial role in our analysis.
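To make the setting concrete, here is a minimal sketch of the kind of update the abstract describes: an online stochastic gradient step in the Gaussian RKHS with no regularization term, where the bandwidth shrinks as samples arrive. The least-squares loss and the polynomially decaying schedules for the step size eta_t and the variance sigma_t are illustrative assumptions, not the paper's exact conditions; because changing sigma changes the RKHS, each expansion term stores the bandwidth in force when it was added.

```python
import numpy as np

def gaussian_kernel(x, y, sigma):
    """Gaussian (RBF) kernel with bandwidth parameter sigma."""
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

class UnregularizedOnlineGaussian:
    """Illustrative unregularized online least-squares learner with a
    varying Gaussian bandwidth. The iterate f_t is kept as a kernel
    expansion; each term remembers its own sigma, since varying the
    variance changes the underlying RKHS."""

    def __init__(self, eta_fn, sigma_fn):
        self.eta_fn = eta_fn      # step-size schedule t -> eta_t (assumed)
        self.sigma_fn = sigma_fn  # decreasing variance schedule t -> sigma_t (assumed)
        self.points, self.coefs, self.sigmas = [], [], []
        self.t = 1

    def predict(self, x):
        return sum(a * gaussian_kernel(xs, x, s)
                   for xs, a, s in zip(self.points, self.coefs, self.sigmas))

    def update(self, x, y):
        # One stochastic gradient step on the squared loss, no regularizer:
        # f_{t+1} = f_t - eta_t * (f_t(x_t) - y_t) * K_{sigma_t}(x_t, .)
        residual = self.predict(x) - y
        self.points.append(x)
        self.coefs.append(-self.eta_fn(self.t) * residual)
        self.sigmas.append(self.sigma_fn(self.t))
        self.t += 1

# Example run with hypothetical polynomial schedules.
learner = UnregularizedOnlineGaussian(eta_fn=lambda t: 0.5 * t ** -0.5,
                                      sigma_fn=lambda t: 1.0 * t ** -0.25)
rng = np.random.default_rng(0)
for _ in range(200):
    x = rng.uniform(-1.0, 1.0, size=2)
    y = np.sin(np.pi * x[0]) + 0.1 * rng.normal()
    learner.update(x, y)
print(learner.predict(np.array([0.3, -0.2])))
```

The exponents in the two schedules govern the trade-off the paper analyzes: a faster-shrinking sigma_t improves the approximation error of the RKHS while making the online iterates harder to control, which is why the derived learning rates tie the step sizes to the variance decay.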




Updated: 2021-04-08