Reduced Rank Ridge Regression and Its Kernel Extensions.
Statistical Analysis and Data Mining (IF 1.3). Pub Date: 2011-11-07. DOI: 10.1002/sam.10138
Ashin Mukherjee, Ji Zhu

In multivariate linear regression, it is often assumed that the response matrix is intrinsically of lower rank. This could be because of the correlation structure among the predictor variables or because the coefficient matrix itself is of low rank. To accommodate both, we propose a reduced rank ridge regression for multivariate linear regression. Specifically, we combine the ridge penalty with a reduced rank constraint on the coefficient matrix to arrive at a computationally straightforward algorithm. Numerical studies indicate that the proposed method consistently outperforms relevant competitors. A novel extension of the proposed method to the reproducing kernel Hilbert space (RKHS) setup is also developed. © 2011 Wiley Periodicals, Inc. Statistical Analysis and Data Mining 4: 612–622, 2011
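The abstract's combination of a ridge penalty with a rank constraint can be sketched as follows: solve the ordinary ridge problem first, then project the fitted responses onto their leading singular directions to enforce the rank constraint. This is a minimal illustrative sketch reconstructed from the abstract, not the authors' reference implementation; the function name and arguments are hypothetical.

```python
import numpy as np

def reduced_rank_ridge(X, Y, lam, rank):
    """Sketch of reduced rank ridge regression.

    X: (n, p) predictor matrix; Y: (n, q) response matrix;
    lam: ridge penalty; rank: target rank of the coefficient matrix.
    """
    p = X.shape[1]
    # Ordinary ridge solution: B_ridge = (X'X + lam*I)^{-1} X'Y
    B_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ Y)
    # SVD of the ridge fitted values; keep the top-`rank` right
    # singular vectors, which span the reduced-rank response subspace.
    _, _, Vt = np.linalg.svd(X @ B_ridge, full_matrices=False)
    P = Vt[:rank].T @ Vt[:rank]  # rank-`rank` projection matrix (q x q)
    # Projecting the ridge coefficients yields a rank-constrained estimate.
    return B_ridge @ P

# Usage on synthetic data
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))
Y = rng.standard_normal((50, 3))
B = reduced_rank_ridge(X, Y, lam=1.0, rank=2)
```

The returned coefficient matrix has rank at most `rank`, while the ridge penalty keeps the intermediate inversion well conditioned even when predictors are highly correlated.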
