Estimating Higher-Order Moments Using Symmetric Tensor Decomposition
SIAM Journal on Matrix Analysis and Applications (IF 1.5). Pub Date: 2020-01-01. DOI: 10.1137/19m1299633
Samantha Sherman, Tamara G. Kolda

We consider the problem of decomposing higher-order moment tensors, i.e., the sum of symmetric outer products of data vectors. Such a decomposition can be used to estimate the means in a Gaussian mixture model and for other applications in machine learning. The $d$th-order empirical moment tensor of a set of $p$ observations of $n$ variables is a symmetric $d$-way tensor. Our goal is to find a low-rank tensor approximation comprising $r \ll p$ symmetric outer products. The challenge is that forming the empirical moment tensor costs $O(pn^d)$ operations and $O(n^d)$ storage, which may be prohibitively expensive; additionally, the standard algorithm for computing the low-rank approximation costs $O(n^d)$ per iteration. Our contribution is to avoid forming the moment tensor altogether, computing its low-rank approximation implicitly using $O(pnr)$ operations per iteration and no extra memory. This advance opens the door to more applications of higher-order moments, since they can now be computed efficiently. We present numerical evidence of the computational savings and show an example of estimating means via higher-order moments.
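
The $O(pnr)$ cost rests on a simple identity: every inner product involving a rank-one term $a_j^{\otimes d}$ collapses to entrywise powers of $X^\top A$, since $\langle x^{\otimes d}, a^{\otimes d}\rangle = (x^\top a)^d$. So the fit (up to the constant $\langle M, M\rangle$, which does not affect the optimization) and its gradient never touch an object of size $n^d$. The sketch below illustrates the idea with a plain gradient formulation in NumPy; the variable names and this particular gradient-based setup are assumptions for illustration, not the authors' algorithm or implementation.

```python
# Minimal sketch (not the authors' code): evaluate the symmetric CP objective
# against the d-th order empirical moment tensor
#   M = (1/p) * sum_i x_i^{(d)}   (d-fold symmetric outer product)
# implicitly, using only the p x r matrix G = X^T A. Nothing of size n^d is
# ever formed; the cost per call is O(pnr) plus O(nr^2) terms.
import numpy as np

def implicit_moment_fit(X, A, lam, d):
    """Objective and gradients for
        f(A, lam) = ||M - sum_j lam_j a_j^{(d)}||_F^2
    with the data-only constant <M, M> dropped (it does not affect optimization).

    X   : n x p data matrix (columns are observations)
    A   : n x r factor matrix (columns a_j)
    lam : length-r weight vector
    d   : moment order
    """
    n, p = X.shape
    G = X.T @ A                      # p x r, entries x_i^T a_j   -- O(pnr)
    C = A.T @ A                      # r x r, entries a_j^T a_k   -- O(nr^2)

    Gd = G ** d                      # (x_i^T a_j)^d
    Cd = C ** d                      # (a_j^T a_k)^d

    # f = -2 <M, sum_j lam_j a_j^{(d)}> + ||sum_j lam_j a_j^{(d)}||^2
    f = -2.0 / p * lam @ Gd.sum(axis=0) + lam @ Cd @ lam

    # Gradients, obtained by differentiating the expanded inner products.
    grad_lam = -2.0 / p * Gd.sum(axis=0) + 2.0 * Cd @ lam
    grad_A = 2.0 * d * (
        -X @ (G ** (d - 1)) / p + A @ ((C ** (d - 1)) * lam[:, None])
    ) * lam[None, :]
    return f, grad_A, grad_lam

# Tiny usage example with random data (hypothetical sizes, order d = 3).
rng = np.random.default_rng(0)
n, p, r, d = 10, 1000, 2, 3
X = rng.standard_normal((n, p))
A = rng.standard_normal((n, r))
lam = np.ones(r)
f, gA, glam = implicit_moment_fit(X, A, lam, d)
print(f, gA.shape, glam.shape)
```

Plugging this function into any standard gradient-based optimizer (e.g., L-BFGS) gives one possible implicit fitting loop under these assumptions; the per-call cost is dominated by the two products X.T @ A and X @ (G ** (d - 1)), each $O(pnr)$.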

Updated: 2020-01-01