Multilayer Sparsity-Based Tensor Decomposition for Low-Rank Tensor Completion
IEEE Transactions on Neural Networks and Learning Systems (IF 10.2). Pub Date: 2021-06-18. DOI: 10.1109/tnnls.2021.3083931
Jize Xue, Yongqiang Zhao, Shaoguang Huang, Wenzhi Liao, Jonathan Cheung-Wai Chan, Seong G. Kong

Existing methods for tensor completion (TC) have limited ability to characterize low-rank (LR) structures. To capture the complex hierarchical knowledge and implicit sparsity attributes hidden in a tensor, we propose a new multilayer sparsity-based tensor decomposition (MLSTD) for low-rank tensor completion (LRTC). The method encodes the structured sparsity of a tensor through a multilayer representation. Specifically, we use the CANDECOMP/PARAFAC (CP) model to decompose a tensor into a sum of rank-1 tensors, and the number of rank-1 components is naturally interpreted as the first-layer sparsity measure. The factor matrices are presumed to be smooth, since the within-mode correlations exhibit a local piecewise structure; in the subspace, this local smoothness can be regarded as the second-layer sparsity. To describe the refined structure of factor/subspace sparsity, we introduce a new sparsity view of subspace smoothness: a self-adaptive low-rank matrix factorization (LRMF) scheme, which serves as the third-layer sparsity. Through this progressive description of the sparsity structure, we formulate the MLSTD model and embed it into the LRTC problem. An effective alternating direction method of multipliers (ADMM) algorithm is then designed for the MLSTD minimization problem. Extensive experiments on RGB images, hyperspectral images (HSIs), and videos substantiate that the proposed LRTC method is superior to state-of-the-art methods.
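
To make the CP-based first layer concrete, below is a minimal NumPy sketch of an EM-style CP tensor completion: it alternates between one unregularized ALS sweep over the factor matrices and re-imputing the missing entries from the current CP estimate. This is not the authors' MLSTD algorithm; the second-layer (factor smoothness) and third-layer (LRMF) regularizers and the ADMM solver are omitted, and all names (`cp_complete`, `khatri_rao`, `rank`, `n_iters`) are our own illustrative choices.

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker product of U (I x R) and V (J x R) -> (I*J x R)."""
    R = U.shape[1]
    return (U[:, None, :] * V[None, :, :]).reshape(-1, R)

def unfold(T, mode):
    """Mode-n matricization of a 3-way tensor (C-order unfolding)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def cp_reconstruct(A, B, C):
    """Rebuild the tensor from CP factors: X[i,j,k] = sum_r A[i,r] B[j,r] C[k,r]."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

def cp_complete(X_obs, mask, rank=10, n_iters=50, seed=0):
    """EM-style CP completion: first-layer (CP rank) sparsity only.
    Smoothness and LRMF regularizers of MLSTD are not included here."""
    rng = np.random.default_rng(seed)
    I, J, K = X_obs.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    X = np.where(mask, X_obs, 0.0)          # start with zeros in the holes
    for _ in range(n_iters):
        # ALS sweep: unregularized least-squares update of each factor
        A = unfold(X, 0) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = unfold(X, 1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = unfold(X, 2) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
        # E-step: keep observed entries, impute the rest from the current model
        X = np.where(mask, X_obs, cp_reconstruct(A, B, C))
    return X

if __name__ == "__main__":
    # toy example: a rank-3 tensor with roughly 40% of entries missing
    rng = np.random.default_rng(1)
    A0, B0, C0 = (rng.standard_normal((n, 3)) for n in (30, 20, 10))
    X_true = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
    mask = rng.random(X_true.shape) > 0.4
    X_rec = cp_complete(np.where(mask, X_true, 0.0), mask, rank=3)
    err = np.linalg.norm((X_rec - X_true)[~mask]) / np.linalg.norm(X_true[~mask])
    print(f"relative error on missing entries: {err:.3f}")
```

In the full MLSTD formulation, the per-factor least-squares updates above would additionally carry smoothness penalties on the factor matrices and a low-rank factorization of the factors themselves, with the coupled subproblems handled by ADMM rather than plain ALS.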

Updated: 2021-06-18