Kernel conjugate gradient methods with random projections
Applied and Computational Harmonic Analysis (IF 2.6) Pub Date: 2021-05-21, DOI: 10.1016/j.acha.2021.05.004
Junhong Lin, Volkan Cevher

We propose and study kernel conjugate gradient methods (KCGM) with random projections for least-squares regression over a separable Hilbert space. Considering two types of random projections, generated by randomized sketches and by Nyström subsampling, we prove optimal statistical results with respect to variants of norms for the algorithms under a suitable stopping rule. In particular, our results show that if the projection dimension is proportional to the effective dimension of the problem, KCGM with randomized sketches generalizes optimally while achieving a computational advantage. As a corollary, we derive optimal rates for classic KCGM in well-conditioned regimes, covering the case where the target function may not lie in the hypothesis space.
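The abstract describes projecting the kernel least-squares problem onto a lower-dimensional subspace and running conjugate gradient with early stopping as the regularizer. The following is a minimal illustrative sketch of the Nyström-subsampling variant, not the authors' implementation: all function names, the RBF kernel choice, and the parameters (`m` landmarks, `n_iter` CG steps as the stopping rule) are assumptions for the example.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row sets A and B (choice assumed for illustration)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kcgm_nystrom(X, y, m=20, gamma=1.0, n_iter=10, seed=0):
    """Sketch of kernel CG least squares with Nystrom subsampling.

    Projects the kernel problem onto m randomly chosen landmark points and
    runs conjugate gradient on the normal equations; n_iter plays the role
    of the early-stopping rule that controls regularization."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)   # Nystrom subsample of landmarks
    Knm = rbf_kernel(X, X[idx], gamma)           # n x m projected kernel features
    # Conjugate gradient on the normal equations Knm^T Knm a = Knm^T y
    A = Knm.T @ Knm
    b = Knm.T @ y
    a = np.zeros(m)
    r = b - A @ a
    p = r.copy()
    rs = r @ r
    for _ in range(n_iter):                      # early stopping = implicit regularization
        Ap = A @ p
        alpha = rs / (p @ Ap)
        a += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < 1e-10:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return idx, a

# Toy usage: recover y = sin(x) from noisy samples
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
idx, a = kcgm_nystrom(X, y, m=40, gamma=0.5, n_iter=15)
y_hat = rbf_kernel(X, X[idx], 0.5) @ a
mse = np.mean((y_hat - y) ** 2)
```

With the projection dimension `m` far below the sample size `n`, each CG iteration costs O(m^2) after the one-time O(nm^2) setup, which is the computational advantage the abstract refers to when `m` scales with the effective dimension.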




Updated: 2021-06-09