Optimal rates for spectral algorithms with least-squares regression over Hilbert spaces
Applied and Computational Harmonic Analysis (IF 2.6), Pub Date: 2018-10-04, DOI: 10.1016/j.acha.2018.09.009
Junhong Lin, Alessandro Rudi, Lorenzo Rosasco, Volkan Cevher

In this paper, we study regression problems over a separable Hilbert space with the square loss, covering non-parametric regression over a reproducing kernel Hilbert space. We investigate a class of spectral/regularized algorithms, including ridge regression, principal component regression, and gradient methods. We prove optimal, high-probability convergence results in terms of variants of norms for the studied algorithms, considering a capacity assumption on the hypothesis space and a general source condition on the target function. Consequently, we obtain almost sure convergence results with optimal rates. Our results improve and generalize previous results, filling a theoretical gap for the non-attainable cases.
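As an illustration of the class of algorithms the abstract refers to, the sketch below (not code from the paper) shows how ridge regression, principal component regression, and gradient descent can all be expressed as spectral filters applied to the eigenvalues of a normalized kernel matrix. The kernel choice, data, and parameter values (`rbf_kernel`, `gamma`, `lam`, step count) are arbitrary placeholders for demonstration only.

```python
import numpy as np

# Minimal sketch of spectral/regularized least-squares over an RKHS:
# each algorithm applies a filter g_lambda to the eigenvalues of K/n,
# giving coefficients alpha = (1/n) V g_lambda(S) V^T y and
# predictions K_test @ alpha.

def rbf_kernel(A, B, gamma=5.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def spectral_fit(K, y, filter_fn):
    """Return coefficients alpha = (1/n) V g(S) V^T y for filter g."""
    n = K.shape[0]
    evals, evecs = np.linalg.eigh(K / n)
    evals = np.clip(evals, 0.0, None)  # drop tiny negative eigenvalues
    return evecs @ (filter_fn(evals) * (evecs.T @ y)) / n

# Three filters corresponding to algorithms named in the abstract.
def ridge_filter(lam):                  # Tikhonov / ridge regression
    return lambda s: 1.0 / (s + lam)

def pcr_filter(lam):                    # spectral cut-off / PCR
    return lambda s: np.where(s >= lam, 1.0 / np.maximum(s, lam), 0.0)

def gradient_filter(t, eta=0.5):        # t steps of gradient descent
    return lambda s: np.where(
        s > 0, (1 - (1 - eta * s) ** t) / np.maximum(s, 1e-12), eta * t
    )

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 1))
    y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(200)
    K = rbf_kernel(X, X)

    X_test = np.linspace(-1, 1, 101)[:, None]
    K_test = rbf_kernel(X_test, X)
    f_true = np.sin(np.pi * X_test[:, 0])

    for name, flt in [("ridge", ridge_filter(1e-3)),
                      ("pcr", pcr_filter(1e-3)),
                      ("gradient", gradient_filter(200))]:
        alpha = spectral_fit(K, y, flt)
        mse = np.mean((K_test @ alpha - f_true) ** 2)
        print(f"{name:8s} test MSE: {mse:.4f}")
```

Note that with the ridge filter g_lambda(s) = 1/(s + lambda), the coefficients reduce to the familiar kernel ridge regression solution (K + n*lambda*I)^{-1} y, which is why the filter is applied to K/n rather than K.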

Updated: 2018-10-04