Distributed regularized least squares with flexible Gaussian kernels
Applied and Computational Harmonic Analysis (IF 2.5), Pub Date: 2021-03-29, DOI: 10.1016/j.acha.2021.03.008
Ting Hu , Ding-Xuan Zhou

We propose a distributed learning algorithm for least squares regression in reproducing kernel Hilbert spaces (RKHSs) generated by flexible Gaussian kernels, based on a divide-and-conquer strategy. Our study demonstrates that Gaussian kernels with flexible variances greatly improve the learning performance of distributed algorithms over those generated by a fixed Gaussian kernel. Under some mild conditions, we establish sharp error bounds for the distributed algorithm with labeled data, in which the variance of the Gaussian kernel serves as a tuning parameter. We show that with suitably chosen parameters our error rates are almost minimax optimal under the standard Sobolev smoothness condition on the target function. By utilizing additional information from unlabeled data for semi-supervised learning, we relax the restrictions on the number of data partitions and on the range of the Sobolev smoothness index.
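The divide-and-conquer scheme the abstract describes is, in its simplest form, standard distributed kernel ridge regression: split the sample into blocks, solve a regularized least squares problem on each block, and average the local predictors. The sketch below illustrates that pipeline in NumPy under simplifying assumptions (equal-size blocks, a single shared regularization parameter lam, and the Gaussian variance sigma exposed as the tuning parameter the abstract refers to); function names and the toy data are illustrative, not the authors' implementation.

    import numpy as np

    def gaussian_kernel(X, Z, sigma):
        """Gaussian kernel K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))."""
        sq = np.sum(X**2, 1)[:, None] + np.sum(Z**2, 1)[None, :] - 2 * X @ Z.T
        return np.exp(-sq / (2 * sigma**2))

    def fit_local_krr(X, y, sigma, lam):
        """Local estimator on one block: solve (K + lam * n * I) alpha = y."""
        n = X.shape[0]
        K = gaussian_kernel(X, X, sigma)
        alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
        return X, alpha

    def distributed_krr(X, y, m, sigma, lam):
        """Divide-and-conquer: fit m local kernel ridge regressors."""
        blocks = np.array_split(np.arange(X.shape[0]), m)
        return [fit_local_krr(X[b], y[b], sigma, lam) for b in blocks]

    def predict(models, X_test, sigma):
        """Synthesis step: average the m local predictions."""
        preds = [gaussian_kernel(X_test, Xb, sigma) @ ab for Xb, ab in models]
        return np.mean(preds, axis=0)

    # Toy usage (hypothetical data): sigma is the flexible kernel variance.
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, (2000, 1))
    y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(2000)
    models = distributed_krr(X, y, m=10, sigma=0.3, lam=1e-3)
    X_test = np.linspace(-1, 1, 5)[:, None]
    print(predict(models, X_test, sigma=0.3))

Averaging the local predictors is the step whose error the paper bounds; tuning sigma along with lam, rather than fixing the kernel in advance, is what the flexible-variance analysis is about.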


