Approximate kernel partial least squares
Annals of Mathematics and Artificial Intelligence (IF 1.2), Pub Date: 2020-03-27, DOI: 10.1007/s10472-020-09694-3
Xiling Liu , Shuisheng Zhou

As an extension of partial least squares (PLS), kernel partial least squares (KPLS) is a very important method for finding nonlinear patterns in data. However, applying KPLS to large-scale problems remains a major challenge, because its storage and computation costs grow with the number of examples. To address this limitation, we use randomness to design scalable new variants of the kernel matrix for solving KPLS. Specifically, we consider the spectral properties of low-rank kernel matrices constructed as sums of random feature dot-products and present a new method, randomized kernel partial least squares (RKPLS), to approximate KPLS. RKPLS alleviates the computational requirements of KPLS, needing only space and computation linear in the sample size. Theoretical analysis and experimental results show that the solution of our algorithm converges in expectation to the solution obtained with the exact kernel matrix.
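To illustrate the general idea described in the abstract, the sketch below approximates an RBF kernel with random Fourier features (so the kernel matrix is a sum of random feature dot-products and never materialized at full n x n size) and then runs linear PLS on the mapped features. It uses scikit-learn's RBFSampler and PLSRegression as stand-ins; this is an assumed, minimal illustration of the low-rank approach, not the authors' exact RKPLS algorithm.

```python
# Minimal sketch of the low-rank idea behind RKPLS (assumptions: RBF kernel,
# random Fourier features as the random feature dot-products, scikit-learn
# components as stand-ins; not the paper's exact algorithm).
import numpy as np
from sklearn.kernel_approximation import RBFSampler
from sklearn.cross_decomposition import PLSRegression

rng = np.random.RandomState(0)
n, p = 2000, 10                       # n samples, p input features
X = rng.randn(n, p)
y = np.sin(X[:, 0]) + 0.1 * rng.randn(n)

# Map inputs to D random features; K ~= Z @ Z.T is a rank-D approximation of
# the exact n x n kernel matrix, so storage and computation stay linear in n.
D = 300
feature_map = RBFSampler(gamma=0.5, n_components=D, random_state=0)
Z = feature_map.fit_transform(X)      # shape (n, D) instead of (n, n)

# Linear PLS on the random features plays the role of KPLS on the
# (approximate) kernel, at cost proportional to n * D.
pls = PLSRegression(n_components=5)
pls.fit(Z, y)
y_hat = pls.predict(Z).ravel()
print("training RMSE:", np.sqrt(np.mean((y_hat - y) ** 2)))
```

As the number of random features D grows, the low-rank matrix Z Z^T approaches the exact kernel matrix in expectation, which is the sense in which the approximate solution converges to the exact one.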

Updated: 2020-03-27