What if Neural Networks had SVDs?
arXiv - CS - Machine Learning. Pub Date: 2020-09-29, DOI: arxiv-2009.13977
Alexander Mathiasen, Frederik Hvilshøj, Jakob Rødsgaard Jørgensen, Anshul Nasery, Davide Mottin

Various Neural Networks employ time-consuming matrix operations like matrix inversion. Many such matrix operations are faster to compute given the Singular Value Decomposition (SVD). Previous work allows using the SVD in Neural Networks without computing it. In theory, these techniques can speed up matrix operations; in practice, however, they are not fast enough. We present an algorithm that is fast enough to speed up several matrix operations. The algorithm increases the degree of parallelism of an underlying matrix multiplication $H\cdot X$, where $H$ is an orthogonal matrix represented by a product of Householder matrices. Code is available at www.github.com/AlexanderMath/fasth .
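To make the underlying computation concrete, below is a minimal NumPy sketch of the sequential baseline the abstract alludes to: applying $H = H_1 H_2 \cdots H_d$ to a batch $X$, where each $H_i = I - 2 v_i v_i^\top / (v_i^\top v_i)$ is a Householder reflection. This is *not* the paper's FastH algorithm (which batches these updates to raise parallelism); it is only a reference implementation, and the names `apply_householder_product`, `vs`, `m`, `d`, `n` are illustrative, not from the paper.

```python
import numpy as np

def apply_householder_product(vs, X):
    """Compute H @ X where H = H_1 @ H_2 @ ... @ H_d and
    H_i = I - 2 v_i v_i^T / (v_i^T v_i).

    Sequential reference: one rank-1 update per factor, so the d
    updates form a serial chain -- the bottleneck FastH addresses."""
    for v in reversed(vs):  # rightmost factor is applied to X first
        X = X - (2.0 / (v @ v)) * np.outer(v, v @ X)
    return X

rng = np.random.default_rng(0)
m, d, n = 6, 6, 3  # matrix size, number of Householder factors, batch width
vs = [rng.standard_normal(m) for _ in range(d)]
X = rng.standard_normal((m, n))

# Orthogonality check: each H_i is symmetric, so H^T is the same factors
# in reversed order; applying H then H^T must recover X exactly.
Y = apply_householder_product(vs, X)
X_back = apply_householder_product(list(reversed(vs)), Y)
assert np.allclose(X, X_back)
```

Because $H^{-1} = H^\top$ comes for free from the factored form, a weight matrix parameterized as $U \,\mathrm{diag}(s)\, V^\top$ with $U$, $V$ stored as Householder products can be inverted or log-determinant-evaluated without ever forming or decomposing a dense matrix, which is what makes the SVD parameterization attractive once $H \cdot X$ itself is fast.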

Updated: 2020-09-30