Least squares regression principal component analysis: A supervised dimensionality reduction method
Numerical Linear Algebra with Applications (IF 1.8). Pub Date: 2021-09-13. DOI: 10.1002/nla.2411
Hector Pascual, Xin C. Yee

Dimensionality reduction is an important technique in surrogate modeling and machine learning. In this article, we propose a supervised dimensionality reduction method, “least squares regression principal component analysis” (LSR-PCA), applicable to both classification and regression problems. To show the efficacy of this method, we present different examples in visualization, classification, and regression problems, comparing it with several state-of-the-art dimensionality reduction methods. Finally, we present a kernel version of LSR-PCA for problems where the inputs are correlated nonlinearly. The examples demonstrate that LSR-PCA can be a competitive dimensionality reduction method.
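The abstract describes the general supervised workflow — reduce the input dimensionality, then fit a regression (or classification) model in the reduced space. The LSR-PCA algorithm itself is not spelled out in the abstract, so the sketch below uses plain, unsupervised PCA (via SVD) as a stand-in for the reduction step, followed by a least-squares fit; it illustrates the pipeline, not the paper's method. All data and dimensions are made up for illustration.

```python
import numpy as np

# Hypothetical sketch of the reduce-then-regress pipeline the abstract
# describes. Plain PCA stands in for LSR-PCA here; the actual LSR-PCA
# projection is supervised and is defined in the paper, not reproduced here.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))          # 200 samples, 10 features
w = rng.normal(size=10)
y = X @ w + 0.1 * rng.normal(size=200)  # noisy regression target

# Unsupervised PCA via SVD of the centered data matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                                   # number of retained components
Z = Xc @ Vt[:k].T                       # project onto the top-k components

# Least-squares regression in the reduced k-dimensional space
beta, *_ = np.linalg.lstsq(Z, y - y.mean(), rcond=None)
y_hat = Z @ beta + y.mean()

print(Z.shape)  # reduced representation: (200, 3)
```

A supervised method such as LSR-PCA chooses the projection using the targets `y` as well as `X`, whereas the PCA step above ignores `y` entirely — that difference is exactly what the paper's comparisons with state-of-the-art methods measure.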

Updated: 2021-09-13