Scaling up Kernel Ridge Regression via Locality Sensitive Hashing
arXiv - CS - Data Structures and Algorithms Pub Date : 2020-03-21 , DOI: arxiv-2003.09756
Michael Kapralov, Navid Nouri, Ilya Razenshteyn, Ameya Velingker, Amir Zandieh

Random binning features, introduced in the seminal paper of Rahimi and Recht (2007), are an efficient method for approximating a kernel matrix using locality sensitive hashing. Random binning features provide a very simple and efficient way of approximating the Laplace kernel but unfortunately do not apply to many important classes of kernels, notably ones that generate smooth Gaussian processes, such as the Gaussian kernel and the Matérn kernel. In this paper, we introduce a simple weighted version of random binning features and show that the corresponding kernel function generates Gaussian processes of any desired smoothness. We show that our weighted random binning features provide a spectral approximation to the corresponding kernel matrix, leading to efficient algorithms for kernel ridge regression. Experiments on large-scale regression datasets show that our method outperforms the random Fourier features method in accuracy.
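To make the abstract's starting point concrete, the classic (unweighted) random binning scheme of Rahimi and Recht can be sketched as follows: each repetition draws a random grid (a cell width per dimension from a Gamma distribution, plus a uniform shift), and the kernel value is estimated by the fraction of grids in which two points hash to the same cell. This is a minimal illustrative sketch for the Laplace kernel only, not the paper's weighted variant; all names and parameters are our own.

```python
# Sketch of unweighted random binning features (Rahimi & Recht, 2007)
# for the Laplace kernel k(x, y) = exp(-gamma * ||x - y||_1).
# The collision probability of one random grid, averaged over the
# Gamma-distributed cell width, recovers the kernel value exactly.
import numpy as np

def random_binning_estimate(x, y, gamma=1.0, reps=20000, seed=0):
    """Estimate k(x, y) as the fraction of random grids in which
    x and y land in the same cell (an LSH collision)."""
    rng = np.random.default_rng(seed)
    x, y = np.atleast_1d(x), np.atleast_1d(y)
    d = x.shape[0]
    # Per repetition and dimension: cell width delta ~ Gamma(shape=2, scale=1/gamma),
    # and a random shift u drawn uniformly from [0, delta).
    delta = rng.gamma(shape=2.0, scale=1.0 / gamma, size=(reps, d))
    u = rng.uniform(size=(reps, d)) * delta
    bin_x = np.floor((x - u) / delta)
    bin_y = np.floor((y - u) / delta)
    same_cell = np.all(bin_x == bin_y, axis=1)  # collision in every dimension
    return same_cell.mean()

x, y = np.array([0.3]), np.array([0.5])
est = random_binning_estimate(x, y, gamma=1.0)
exact = float(np.exp(-np.abs(x - y).sum()))  # true Laplace kernel value
```

As the abstract notes, this construction is tied to the Laplace kernel; the paper's contribution is a weighted version of these bins whose induced kernel can match any desired smoothness.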

Updated: 2020-03-24