Automatic differentiation for Riemannian optimization on low-rank matrix and tensor-train manifolds
arXiv - CS - Mathematical Software. Pub Date: 2021-03-27, arXiv:2103.14974
Alexander Novikov, Maxim Rakhuba, Ivan Oseledets

In scientific computing and machine learning applications, matrices and more general multidimensional arrays (tensors) can often be approximated with the help of low-rank decompositions. Since matrices and tensors of fixed rank form smooth Riemannian manifolds, one popular tool for finding low-rank approximations is Riemannian optimization. Nevertheless, efficient implementation of the Riemannian gradients and Hessians required by Riemannian optimization algorithms can be a nontrivial task in practice. Moreover, in some cases, analytic formulas are not even available. In this paper, we build upon automatic differentiation and propose a method that, given an implementation of the function to be minimized, efficiently computes Riemannian gradients and matrix-by-vector products of the approximate Riemannian Hessian with a given vector.
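As a minimal sketch of the underlying idea (not the paper's implementation), the Riemannian gradient on the fixed-rank matrix manifold is the orthogonal projection of the Euclidean gradient onto the tangent space at a point X = U S Vᵀ. The example below computes this projection by hand for the simple loss f(X) = ½‖X − A‖²_F, whose Euclidean gradient is X − A; all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 8, 6, 2

# A rank-r point X on the manifold, stored in factored form X = U @ diag(s) @ V.T
U0 = np.linalg.qr(rng.standard_normal((m, r)))[0]
V0 = np.linalg.qr(rng.standard_normal((n, r)))[0]
s = rng.uniform(1.0, 2.0, r)
X = U0 @ np.diag(s) @ V0.T

A = rng.standard_normal((m, n))

# Euclidean gradient of f(X) = 0.5 * ||X - A||_F^2 is simply X - A
euclid_grad = X - A

def project_tangent(U, V, Z):
    """Orthogonal projection of Z onto the tangent space of the
    fixed-rank manifold at X = U S V^T (U, V orthonormal):
    P_X(Z) = U U^T Z + Z V V^T - U U^T Z V V^T."""
    UtZ = U.T @ Z
    ZV = Z @ V
    return U @ UtZ + ZV @ V.T - U @ (UtZ @ V) @ V.T

riem_grad = project_tangent(U0, V0, euclid_grad)

# The Riemannian gradient lies in the tangent space, so projecting again is a no-op
assert np.allclose(project_tangent(U0, V0, riem_grad), riem_grad)
```

In practice (and in the paper's setting) the Euclidean gradient would come from automatic differentiation of an arbitrary loss, and the factored structure would be exploited so that the full m × n matrices are never materialized.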

Updated: 2021-03-30