Accelerating Tensor Contraction Products via Tensor-Train Decomposition [Tips & Tricks]
IEEE Signal Processing Magazine ( IF 14.9 ) Pub Date : 2022-08-29 , DOI: 10.1109/msp.2022.3156744
Ilya Kisil 1 , Giuseppe G. Calvi 2 , Kriton Konstantinidis 3 , Yao Lei Xu 3 , Danilo P. Mandic 3
Tensors (multiway arrays) and tensor decompositions (TDs) have recently received tremendous attention in the data analytics community, due to their ability to mitigate the curse of dimensionality associated with modern large-dimensional big data [1], [2]. Indeed, TDs allow for data volume (e.g., the parameter complexity) to be reduced from scaling exponentially to scaling linearly in the tensor dimensions, which facilitates applications in areas including the compression and interpretability of neural networks [1], [3], multimodal learning [1], and completion of knowledge graphs [4], [5]. At the heart of TD techniques is the tensor contraction product (TCP), an operator used for representing even the most unmanageable higher-order tensors through a set of small-scale core tensors that are interconnected via TCP operations [2].
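The exponential-to-linear storage reduction and the role of the TCP can be illustrated with a minimal NumPy sketch of the tensor-train (TT) format. All shapes, rank values, and helper names below are illustrative assumptions, not taken from the paper; the contraction is performed with `np.tensordot`.

```python
import numpy as np

# Illustrative mode size I and TT rank r (hypothetical values, not from the paper).
# Full storage of an order-4 tensor grows as I^4, while TT storage grows
# linearly in the tensor order (one small core per mode).
I, r = 8, 3
rng = np.random.default_rng(0)

# TT cores G_k of shape (r_{k-1}, I, r_k), with boundary ranks r_0 = r_4 = 1.
cores = [
    rng.standard_normal((1, I, r)),
    rng.standard_normal((r, I, r)),
    rng.standard_normal((r, I, r)),
    rng.standard_normal((r, I, 1)),
]

def tt_to_full(cores):
    """Reconstruct the full tensor by chaining tensor contraction products
    (TCPs) over the rank indices connecting adjacent cores."""
    full = cores[0]
    for core in cores[1:]:
        # Contract the trailing rank axis of `full` with the leading rank
        # axis of the next core -- this pairwise contraction is a TCP.
        full = np.tensordot(full, core, axes=([-1], [0]))
    # Drop the two size-1 boundary rank axes.
    return full.squeeze()

T = tt_to_full(cores)
tt_params = sum(c.size for c in cores)    # 192 parameters in TT format
full_params = T.size                      # 4096 parameters in full format
print(T.shape)                            # (8, 8, 8, 8)
print(tt_params, full_params)
```

Here the four cores hold 192 parameters in total, versus 4096 entries of the full order-4 tensor, and the gap widens exponentially as the tensor order grows.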

Updated: 2022-09-03