Generalized Canonical Polyadic Tensor Decomposition
SIAM Review (IF 10.2), Pub Date: 2020-02-11, DOI: 10.1137/18m1203626
David Hong, Tamara G. Kolda, Jed A. Duersch

SIAM Review, Volume 62, Issue 1, Pages 133-163, January 2020.
Tensor decomposition is a fundamental unsupervised machine learning method in data science, with applications including network analysis and sensor data processing. This work develops a generalized canonical polyadic (GCP) low-rank tensor decomposition that allows other loss functions besides squared error. For instance, we can use logistic loss or Kullback--Leibler divergence, enabling tensor decomposition for binary or count data. We present a variety of statistically motivated loss functions for various scenarios. We provide a generalized framework for computing gradients and handling missing data that enables the use of standard optimization methods for fitting the model. We demonstrate the flexibility of the GCP decomposition on several real-world examples including interactions in a social network, neural activity in a mouse, and monthly rainfall measurements in India.
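To make the abstract's recipe concrete, the sketch below illustrates one instance of the GCP idea in plain NumPy: a rank-R CP model with Bernoulli (logistic) loss, an elementwise gradient tensor, MTTKRP-style factor gradients, and a 0/1 mask for missing entries. This is only a minimal illustration, not the authors' implementation; the function names `gcp_gradients` and `fit_gcp`, the fixed-step gradient descent, and the hyperparameters are hypothetical choices made for this example.

```python
import numpy as np

def gcp_gradients(X, W, A, B, C):
    """Gradients of a GCP objective with Bernoulli (logistic) loss.

    X : binary data tensor, shape (I, J, K)
    W : 0/1 mask of the same shape; 0 marks missing entries
    A, B, C : factor matrices of shapes (I, R), (J, R), (K, R)
    """
    # Low-rank model tensor M = sum_r a_r o b_r o c_r
    M = np.einsum('ir,jr,kr->ijk', A, B, C)
    # Elementwise derivative of f(x, m) = log(1 + e^m) - x*m,
    # masked so that missing entries contribute nothing.
    Y = W * (1.0 / (1.0 + np.exp(-M)) - X)
    # Factor gradients are MTTKRPs of Y with the other two factors.
    GA = np.einsum('ijk,jr,kr->ir', Y, B, C)
    GB = np.einsum('ijk,ir,kr->jr', Y, A, C)
    GC = np.einsum('ijk,ir,jr->kr', Y, A, B)
    return GA, GB, GC

def fit_gcp(X, W, R, steps=500, lr=0.05, seed=0):
    """Toy optimizer: plain gradient descent on the factor matrices."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = 0.1 * rng.standard_normal((I, R))
    B = 0.1 * rng.standard_normal((J, R))
    C = 0.1 * rng.standard_normal((K, R))
    for _ in range(steps):
        GA, GB, GC = gcp_gradients(X, W, A, B, C)
        A -= lr * GA
        B -= lr * GB
        C -= lr * GC
    return A, B, C
```

In practice one would hand these gradients to a standard off-the-shelf optimizer rather than the fixed-step loop above, which is the point of the paper's generalized gradient framework. Swapping in another statistically motivated loss only changes the elementwise derivative line; for example, for count data with a Poisson model under an identity link (with a positive model tensor M), the derivative of m - x*log(m) gives Y = W * (1 - X / M).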


Updated: 2020-02-11