Analysis of regularized least-squares in reproducing kernel Kreĭn spaces
Machine Learning (IF 7.5), Pub Date: 2021-04-19, DOI: 10.1007/s10994-021-05955-2
Fanghui Liu, Lei Shi, Xiaolin Huang, Jie Yang, Johan A. K. Suykens

In this paper, we study the asymptotic properties of regularized least squares with indefinite kernels in reproducing kernel Kreĭn spaces (RKKS). By introducing a bounded hyper-sphere constraint into this non-convex regularized risk-minimization problem, we theoretically demonstrate that the problem has a globally optimal solution with a closed form on the sphere, which makes approximation analysis feasible in RKKS. For the original regularizer induced by the indefinite inner product, we modify traditional error-decomposition techniques, prove convergence results for the introduced hypothesis error based on matrix perturbation theory, and derive learning rates for this regularized regression problem in RKKS. Under some conditions, the derived learning rates in RKKS are the same as those in reproducing kernel Hilbert spaces (RKHS). To the best of our knowledge, this is the first work on approximation analysis of regularized learning algorithms in RKKS.
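The setting above can be made concrete with a small sketch. The snippet below is not the authors' algorithm: it builds the Gram matrix of a tanh (sigmoid) kernel, a classic indefinite kernel whose Gram matrix can have negative eigenvalues, and then solves the usual kernel ridge stationarity equation. With an indefinite kernel this closed form is only a stationary point of the non-convex objective, which is precisely why the paper imposes a hyper-sphere constraint to recover a well-defined global optimum. The data, kernel parameters, and regularization value are illustrative choices.

```python
import numpy as np

def tanh_kernel(X, Z, a=1.0, b=-0.5):
    """Sigmoid/tanh kernel: an indefinite kernel in general,
    i.e., its Gram matrix need not be positive semi-definite."""
    return np.tanh(a * X @ Z.T + b)

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(40, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

K = tanh_kernel(X, X)
# Inspect the spectrum: a negative minimum eigenvalue confirms indefiniteness.
print("min eigenvalue of K:", np.linalg.eigvalsh(K).min())

# Stationarity condition of the regularized least-squares objective:
#   alpha = (K + n * lam * I)^{-1} y
# For an indefinite K this is a stationary point, not necessarily a
# global minimum -- the paper's bounded hyper-sphere constraint is what
# restores a globally optimal closed-form solution in RKKS.
n, lam = K.shape[0], 1e-2
alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
y_hat = K @ alpha
print("training MSE:", np.mean((y_hat - y) ** 2))
```

In an RKHS (positive-definite kernel) the same linear system gives the unique global minimizer of kernel ridge regression; the indefinite case is where the paper's sphere constraint and perturbation-based error analysis come in.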




Updated: 2021-04-19