Enhanced Nonconvex Low-Rank Approximation of Tensor Multi-Modes for Tensor Completion
IEEE Transactions on Computational Imaging (IF 5.4), Pub Date: 2021-01-22, DOI: 10.1109/tci.2021.3053699
Haijin Zeng, Yongyong Chen, Xiaozhen Xie, Jifeng Ning

Higher-order low-rank tensors arise in many data processing applications and have attracted great interest. Inspired by low-rank approximation theory, researchers have proposed a series of effective tensor completion methods. However, most of these methods directly consider only the global low-rankness of the underlying tensor, which is insufficient at low sampling rates; in addition, the single nuclear norm or its relaxation is usually adopted to approximate the rank function, which can lead to suboptimal solutions that deviate from the original. To alleviate these problems, in this paper we propose a novel low-rank approximation of tensor multi-modes (LRATM), in which a double nonconvex $L_{\gamma }$ norm is designed to represent the underlying joint manifold drawn from the factorization factors of each mode of the underlying tensor. An algorithm based on the block successive upper-bound minimization method is designed to solve the proposed model efficiently, and we show that the numerical scheme converges to coordinatewise minimizers. Numerical results on three types of public multi-dimensional datasets show that our algorithm can recover a variety of low-rank tensors from significantly fewer samples than the compared methods.
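The abstract does not spell out the exact form of the double nonconvex $L_{\gamma }$ norm, so the snippet below is only a minimal illustrative sketch of the general idea of multi-mode low-rank penalization: it evaluates a generic Laplace-type nonconvex surrogate of the rank function (a hypothetical stand-in for the paper's $L_{\gamma }$ norm) on the singular values of every mode unfolding of a tensor. The penalty form, the parameter `gamma`, and all function names are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only: a generic nonconvex surrogate penalty applied to the
# singular values of each mode unfolding, mimicking multi-mode low-rank modeling.
# The exact L_gamma norm in the paper may differ from the form assumed here.
import numpy as np

def l_gamma_penalty(sigma, gamma=1.0):
    # Hypothetical Laplace-type nonconvex surrogate of the rank function,
    # evaluated on a vector of singular values.
    return np.sum((1.0 + gamma) * sigma / (gamma + sigma))

def multimode_lowrank_measure(T, gamma=1.0):
    # Sum the surrogate penalty over the unfoldings of every tensor mode.
    total = 0.0
    for mode in range(T.ndim):
        unfolding = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        sigma = np.linalg.svd(unfolding, compute_uv=False)
        total += l_gamma_penalty(sigma, gamma)
    return total

# Example: a random rank-3 third-order tensor built from factor matrices.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 3))
B = rng.standard_normal((40, 3))
C = rng.standard_normal((50, 3))
T = np.einsum('ir,jr,kr->ijk', A, B, C)
print(multimode_lowrank_measure(T))
```

In a completion setting, a penalty of this kind would be combined with a data-fidelity term on the observed entries and minimized block by block; the paper does this with a block successive upper-bound minimization scheme, whereas the sketch above only evaluates the penalty itself.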

Updated: 2021-02-12