Parametric PCA for unsupervised metric learning
Pattern Recognition Letters (IF 5.1) · Pub Date: 2020-05-16 · DOI: 10.1016/j.patrec.2020.05.011
Alexandre L.M. Levada

In pattern recognition, quantifying a suitable similarity measure between different objects in a collection is a challenging task, especially in cases where the standard Euclidean distance is not a reasonable choice. In this context, dimensionality reduction algorithms are powerful tools for unsupervised metric learning. In this paper, we propose a framework for building dimensionality reduction methods for unsupervised metric learning based on mapping local neighborhoods of the KNN graph to a parametric feature space, defined in terms of a statistical model. By incorporating a non-Euclidean metric based on the Bhattacharyya coefficient, we define the parametric kernel matrix, a surrogate for the covariance matrix of the parametric feature vectors. Inspired by PCA, we use the eigenvalues of the parametric kernel matrix to learn features for the original data. Numerical experiments with real datasets show that Parametric PCA produces better-defined clusters and more discriminative features than regular PCA, kernel PCA, sparse PCA, robust PCA, and manifold learning algorithms, making the proposed method a promising alternative for unsupervised metric learning.
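The pipeline described in the abstract (KNN neighborhoods → parametric feature vectors → Bhattacharyya-based kernel → PCA-style eigendecomposition) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a Gaussian model with diagonal covariance for each neighborhood, uses the closed-form Bhattacharyya distance between univariate Gaussians summed over features, and centers the kernel as in kernel PCA. The function name `parametric_pca` and all parameter choices are hypothetical.

```python
import numpy as np

def parametric_pca(X, k=10, n_components=2):
    """Sketch of Parametric PCA under a diagonal-Gaussian neighborhood model."""
    n, m = X.shape
    # 1. KNN graph: Euclidean k nearest neighbors of each sample (skip self).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    knn = np.argsort(d2, axis=1)[:, 1:k + 1]
    # 2. Parametric feature vectors: per-feature mean and variance of each
    #    neighborhood patch (the statistical model fitted to the KNN patch).
    patches = X[knn]                           # shape (n, k, m)
    mu = patches.mean(axis=1)                  # shape (n, m)
    var = patches.var(axis=1) + 1e-6           # shape (n, m), regularized
    # 3. Parametric kernel via the Bhattacharyya coefficient BC = exp(-D_B),
    #    with D_B for two Gaussians summed over the m features.
    K = np.empty((n, n))
    for i in range(n):
        s2 = var[i] + var                      # sigma_i^2 + sigma_j^2
        db = 0.25 * np.log(0.25 * (var[i] / var + var / var[i] + 2)) \
           + 0.25 * (mu[i] - mu) ** 2 / s2
        K[i] = np.exp(-db.sum(axis=1))
    # 4. PCA-style spectral decomposition of the centered kernel matrix.
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    w, V = np.linalg.eigh(Kc)                  # ascending eigenvalues
    order = np.argsort(w)[::-1][:n_components]
    # Embed samples using the top eigenvectors scaled by sqrt(eigenvalue).
    return V[:, order] * np.sqrt(np.maximum(w[order], 0.0))

# Usage on synthetic data:
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 4))
Z = parametric_pca(X, k=8, n_components=2)     # (60, 2) learned features
```

The Bhattacharyya-based kernel replaces the Euclidean inner products of ordinary PCA, so the learned features reflect statistical similarity between local neighborhoods rather than raw point distances.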



