Two-level preconditioning for Ridge Regression
Numerical Linear Algebra with Applications (IF 4.3), Pub Date: 2021-03-11, DOI: 10.1002/nla.2371
Joris Tavernier, Jaak Simm, Karl Meerbergen, Yves Moreau

Solving linear systems is often the computational bottleneck in real-life problems. Iterative solvers are frequently the only option, either because direct algorithms are too expensive or because the system matrix is not explicitly known. Here, we develop a two-level preconditioner for regularized least-squares linear systems involving a feature or data matrix. Variants of this linear system appear in machine learning applications such as ridge regression, logistic regression, support vector machines, and Bayesian regression. We use clustering algorithms to create a coarser level that preserves the principal components of the covariance or Gram matrix. This coarser level approximates the dominant eigenvectors and is used to build a subspace preconditioner that accelerates the conjugate gradient method. We observe speed-ups on both artificial and real-life data.
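The idea described in the abstract — cluster the features to build a coarse basis that captures the dominant eigenvectors of the Gram matrix, then use it in a two-level preconditioner for CG — can be sketched as below. This is an illustrative reconstruction, not the authors' implementation: the aggregation prolongation `P`, the λ-scaled smoother on the complement space, and all problem sizes are assumptions chosen to make the example self-contained.

```python
import numpy as np
from scipy.cluster.vq import kmeans2
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)

# Synthetic ridge problem: features (columns of X) fall into k groups,
# so the Gram matrix X.T @ X has roughly k dominant eigenvalues.
n, d, k = 200, 100, 10
base = rng.standard_normal((n, k))            # one "prototype" feature per group
group = rng.integers(0, k, size=d)
X = base[:, group] + 0.05 * rng.standard_normal((n, d))
lam = 1e-2
y = rng.standard_normal(n)

A = X.T @ X + lam * np.eye(d)                 # ridge normal-equations matrix
b = X.T @ y

# Coarse level: cluster the features (columns of X), aggregate per cluster.
_, labels = kmeans2(X.T, k, minit="points", seed=1)
P = np.zeros((d, k))
P[np.arange(d), labels] = 1.0
P /= np.maximum(np.sqrt(P.sum(axis=0)), 1.0)  # orthonormal aggregation basis

Ac = P.T @ A @ P                              # small k-by-k coarse matrix
Ac_inv = np.linalg.pinv(Ac)                   # pinv guards against empty clusters

def two_level(r):
    # Coarse correction on the (approximate) dominant subspace, plus a
    # scaled-identity smoother on its orthogonal complement, where the
    # spectrum is close to the regularization parameter lam.
    rc = P.T @ r
    return P @ (Ac_inv @ rc) + (r - P @ rc) / lam

M = LinearOperator((d, d), matvec=two_level)

iters = {"plain": 0, "prec": 0}
def count(key):
    def cb(xk):
        iters[key] += 1
    return cb

w_plain, info_plain = cg(A, b, maxiter=2000, callback=count("plain"))
w_prec, info_prec = cg(A, b, M=M, maxiter=2000, callback=count("prec"))

print("plain CG iterations:    ", iters["plain"], "info:", info_plain)
print("two-level CG iterations:", iters["prec"], "info:", info_prec)
```

Because the features are strongly clustered here, the aggregation basis nearly spans the dominant eigenspace and the preconditioned iteration count drops sharply; on real data one would expect a milder but still useful reduction.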
