Generalized Learning Vector Quantization With Log-Euclidean Metric Learning on Symmetric Positive-Definite Manifold
IEEE Transactions on Cybernetics (IF 11.8), Pub Date: 2022-06-14, DOI: 10.1109/tcyb.2022.3178412
Fengzhen Tang, Peter Tino, Haibin Yu

In many classification scenarios, the data to be analyzed can be naturally represented as points living on the curved Riemannian manifold of symmetric positive-definite (SPD) matrices. Due to its non-Euclidean geometry, usual Euclidean learning algorithms may deliver poor performance on such data. We propose a principled reformulation of the successful Euclidean generalized learning vector quantization (GLVQ) methodology to deal with such data, accounting for the nonlinear Riemannian geometry of the manifold through the log-Euclidean metric (LEM). We first generalize GLVQ to the manifold of SPD matrices by exploiting the LEM-induced geodesic distance (GLVQ-LEM). We then extend GLVQ-LEM with metric learning. In particular, we study both 1) a more straightforward implementation of the metric learning idea, adapting the metric in the space of vectorized log-transformed SPD matrices, and 2) the full formulation of metric learning without matrix vectorization, thus preserving the second-order tensor structure. To obtain the distance metric in the full LEM learning (LEML) approaches, two algorithms are proposed. One restricts the distance metric to be full rank, treating the distance metric tensor as an SPD matrix and thus readily reusing the LEM framework (GLVQ-LEML-LEM). The other imposes no such restriction, treating the distance metric tensor as a fixed-rank positive semidefinite matrix living on a quotient manifold whose total space is equipped with a flat geometry (GLVQ-LEML-FM). Experiments on multiple datasets of different natures demonstrate the good performance of the proposed methods.
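The LEM-induced geodesic distance that underlies GLVQ-LEM has the well-known closed form d_LE(X, Y) = ||log(X) − log(Y)||_F, where log denotes the matrix logarithm. The sketch below is not the authors' implementation; it only illustrates this distance and a minimal GLVQ update run on vectorized log-transformed SPD matrices (the representation behind variant 1 above), with one prototype per class. The names spd_log, le_distance, and glvq_lem_fit are placeholders, and the metric-learning extensions (GLVQ-LEML-LEM/FM) are not reproduced.

```python
import numpy as np

def spd_log(X):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(X)
    return (V * np.log(w)) @ V.T

def le_distance(X, Y):
    """Log-Euclidean geodesic distance: ||log(X) - log(Y)||_F."""
    return np.linalg.norm(spd_log(X) - spd_log(Y), "fro")

def glvq_lem_fit(X_spd, y, lr=0.05, epochs=50, seed=0):
    """Minimal GLVQ on vectorized log-transformed SPD matrices.
    One prototype per class, initialized at the class mean in log-space;
    squared Euclidean distance in log-space equals the squared LEM distance."""
    y = np.asarray(y)
    rng = np.random.default_rng(seed)
    logs = np.stack([spd_log(X).ravel() for X in X_spd])
    classes = np.unique(y)
    protos = np.stack([logs[y == c].mean(axis=0) for c in classes])
    for _ in range(epochs):
        for i in rng.permutation(len(logs)):
            x, c = logs[i], y[i]
            d = ((protos - x) ** 2).sum(axis=1)              # squared distances to prototypes
            same = classes == c
            jp = np.flatnonzero(same)[np.argmin(d[same])]    # closest correct prototype
            jm = np.flatnonzero(~same)[np.argmin(d[~same])]  # closest incorrect prototype
            dp, dm = d[jp], d[jm]
            mu = (dp - dm) / (dp + dm)                       # GLVQ relative distance difference
            g = np.exp(-mu) / (1.0 + np.exp(-mu)) ** 2       # derivative of the logistic at mu
            denom = (dp + dm) ** 2
            protos[jp] += lr * g * (4 * dm / denom) * (x - protos[jp])  # attract correct prototype
            protos[jm] -= lr * g * (4 * dp / denom) * (x - protos[jm])  # repel incorrect prototype
    return protos, classes
```

Because each learned prototype remains a linear combination of symmetric log-matrices, mapping it back through the matrix exponential yields an SPD prototype on the manifold.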

Updated: 2022-06-14