Intrinsic Grassmann Averages for Online Linear, Robust and Nonlinear Subspace Learning
IEEE Transactions on Pattern Analysis and Machine Intelligence (IF 20.8) Pub Date: 2020-05-04, DOI: 10.1109/tpami.2020.2992392
Rudrasis Chakraborty, Liu Yang, Søren Hauberg, Baba C. Vemuri

Principal component analysis (PCA) and kernel principal component analysis (KPCA) are fundamental methods in machine learning for dimensionality reduction. The former finds a low-dimensional linear approximation of the data in finite dimensions, while the latter typically operates in an infinite-dimensional reproducing kernel Hilbert space (RKHS). In this paper, we present a geometric framework for computing the principal linear subspaces in both (finite- and infinite-dimensional) settings, as well as in the robust PCA case, which amounts to computing the intrinsic average on the space of all subspaces: the Grassmann manifold. Points on this manifold are defined as the subspaces spanned by K-tuples of observations. The intrinsic Grassmann average of these subspaces is shown to coincide with the principal components of the observations when they are drawn from a Gaussian distribution. We show similar results in the RKHS case and provide an efficient algorithm for computing the projection onto this average subspace. The result is a method akin to KPCA but substantially faster. Further, we present a novel online version of KPCA using our geometric framework. Competitive performance of all our algorithms is demonstrated on a variety of real and synthetic data sets.
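To make the central construction concrete, the following is a minimal NumPy sketch of an incremental intrinsic average on the Grassmann manifold. It is an illustration under stated assumptions, not the authors' implementation: it uses the standard closed-form exponential and logarithm maps for the Grassmannian, and the function names (`grassmann_log`, `grassmann_exp`, `incremental_grassmann_average`), the toy data setup, and the 1/k step size of the incremental Fréchet mean are all illustrative choices.

```python
import numpy as np

def grassmann_log(X, Y):
    """Log map: tangent vector at span(X) pointing toward span(Y).

    X, Y are n x p matrices with orthonormal columns; X^T Y must be
    invertible (true for subspaces in general position).
    """
    XtY = X.T @ Y
    A = (Y - X @ XtY) @ np.linalg.inv(XtY)      # (I - X X^T) Y (X^T Y)^{-1}
    U, S, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(np.arctan(S)) @ Vt

def grassmann_exp(X, H):
    """Exp map: follow the geodesic from span(X) with tangent H for unit time."""
    U, S, Vt = np.linalg.svd(H, full_matrices=False)
    Y = X @ Vt.T @ np.diag(np.cos(S)) @ Vt + U @ np.diag(np.sin(S)) @ Vt
    Q, _ = np.linalg.qr(Y)                       # re-orthonormalize for stability
    return Q

def incremental_grassmann_average(subspaces):
    """Incremental Frechet mean of a stream of equi-dimensional subspaces."""
    M = subspaces[0]
    for k, X in enumerate(subspaces[1:], start=2):
        M = grassmann_exp(M, grassmann_log(M, X) / k)   # step 1/k toward X
    return M

# Toy check (hypothetical setup): subspaces spanned by K-tuples of Gaussian
# samples; their intrinsic average should align with the top-K principal
# subspace of the sampling covariance.
rng = np.random.default_rng(0)
n, K, num_tuples = 10, 2, 500
cov = np.diag(np.linspace(5.0, 0.1, n))          # anisotropic diagonal covariance
tuples = rng.multivariate_normal(np.zeros(n), cov, size=(num_tuples, K))
spans = [np.linalg.qr(t.T)[0] for t in tuples]   # orthonormal basis per K-tuple
mean_subspace = incremental_grassmann_average(spans)

top_pcs = np.eye(n)[:, :K]                       # true top-K principal subspace
cosines = np.linalg.svd(mean_subspace.T @ top_pcs, compute_uv=False)
print("cosines of principal angles:", cosines)   # near 1 when subspaces agree
```

The QR step after the exponential map only guards against numerical drift of the orthonormal basis, and the 1/k schedule is the textbook incremental estimator of a Fréchet mean; processing one subspace at a time is what gives the method the online flavor emphasized in the abstract.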

Updated: 2024-08-22