Robust tensor train component analysis
Numerical Linear Algebra with Applications (IF 4.3). Pub Date: 2021-07-13. DOI: 10.1002/nla.2403
Xiongjun Zhang, Michael K. Ng

Robust principal component analysis plays a key role in fields such as image and video processing, data mining, and hyperspectral data analysis. In this paper, we study robust tensor train (TT) principal component analysis from partial observations, which aims to decompose a given tensor into a low-TT-rank component and a sparse component. The decomposition is used to find hidden factors and helps alleviate the curse of dimensionality via a set of connected low-rank tensors. The relaxation model minimizes a weighted combination of the sum of the nuclear norms of the unfolding matrices of the core tensors and the tensor ℓ1 norm. A proximal alternating direction method of multipliers is developed to solve the resulting model. Furthermore, we show that, under some conditions, any cluster point of a convergent subsequence is a Karush-Kuhn-Tucker point of the proposed model. Extensive numerical examples on both synthetic and real-world datasets demonstrate the effectiveness of the proposed approach.

Updated: 2021-07-13