Symmetric tensor decomposition by alternating gradient descent
Numerical Linear Algebra with Applications ( IF 1.8 ) Pub Date : 2021-08-07 , DOI: 10.1002/nla.2406
Haixia Liu 1, 2

The symmetric tensor decomposition problem is fundamental in many fields and has attracted considerable research interest. In general, a greedy algorithm is used for tensor decomposition: first find the largest singular value and the corresponding singular vector, subtract the associated rank-one component from the tensor, and then repeat the process. In this article, we focus on designing an effective algorithm and giving its convergence analysis. We introduce an exceedingly simple and fast algorithm for the rank-one approximation step of symmetric tensor decomposition. Through variable splitting, we recast the symmetric tensor decomposition problem as the minimization of a multiconvex optimization problem, which we solve with an alternating gradient descent algorithm. Although we focus on symmetric tensors in this article, the method can be extended to nonsymmetric tensors in some cases. Additionally, we give a theoretical analysis of our alternating gradient descent algorithm, proving that it converges linearly to a global minimizer. We also provide numerical results demonstrating the effectiveness of the algorithm.
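The variable-splitting idea described above can be sketched as follows: approximate an order-3 tensor T by a rank-one term u ⊗ v ⊗ w, minimizing the multiconvex objective f(u, v, w) = ||T − u ⊗ v ⊗ w||² by alternating gradient steps on each factor. This is an illustrative sketch under stated assumptions (fixed per-block step sizes derived from the block Lipschitz constants), not a reproduction of the paper's exact algorithm or step-size rule.

```python
import numpy as np

def agd_rank1(T, iters=50, seed=0):
    """Rank-one approximation of an order-3 tensor by alternating gradient
    descent after variable splitting: f(u, v, w) = ||T - u (x) v (x) w||_F^2
    is a convex quadratic in each factor when the other two are fixed.
    Hypothetical sketch; step sizes are set to 1/L for each block, where
    L is that block's gradient Lipschitz constant."""
    rng = np.random.default_rng(seed)
    n = T.shape[0]
    u, v, w = rng.standard_normal((3, n))
    for _ in range(iters):
        # Gradient in u is 2*||v||^2*||w||^2*u - 2*T(., v, w), with block
        # Lipschitz constant L_u = 2*||v||^2*||w||^2; step size 1/L_u.
        a = (v @ v) * (w @ w)
        u -= (2 * a * u - 2 * np.einsum('ijk,j,k->i', T, v, w)) / (2 * a)
        # Same update pattern for the v-block ...
        b = (u @ u) * (w @ w)
        v -= (2 * b * v - 2 * np.einsum('ijk,i,k->j', T, u, w)) / (2 * b)
        # ... and for the w-block.
        c = (u @ u) * (v @ v)
        w -= (2 * c * w - 2 * np.einsum('ijk,i,j->k', T, u, v)) / (2 * c)
    return u, v, w

# Usage: recover a symmetric rank-one tensor 2 * x (x) x (x) x.
x = np.array([1.0, 2.0, -1.0, 0.5])
x /= np.linalg.norm(x)
T = 2.0 * np.einsum('i,j,k->ijk', x, x, x)
u, v, w = agd_rank1(T)
err = np.linalg.norm(np.einsum('i,j,k->ijk', u, v, w) - T)
```

A greedy decomposition in the sense of the abstract would wrap this routine in a deflation loop: compute the rank-one approximation, subtract it from T, and repeat on the residual.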

Updated: 2021-08-07