Connecting Weighted Automata, Tensor Networks and Recurrent Neural Networks through Spectral Learning
arXiv - CS - Formal Languages and Automata Theory Pub Date : 2020-10-19 , DOI: arxiv-2010.10029
Tianyu Li, Doina Precup, Guillaume Rabusseau

In this paper, we present connections between three models used in different research fields: weighted finite automata~(WFA) from formal languages and linguistics, recurrent neural networks used in machine learning, and tensor networks, which encompass a set of optimization techniques for high-order tensors used in quantum physics and numerical analysis. We first present an intrinsic relation between WFA and the tensor train decomposition, a particular form of tensor network. This relation allows us to exhibit a novel low-rank structure of the Hankel matrix of a function computed by a WFA, and to design an efficient spectral learning algorithm that leverages this structure to scale up to very large Hankel matrices. We then unravel a fundamental connection between WFA and second-order recurrent neural networks~(2-RNN): in the case of sequences of discrete symbols, WFA and 2-RNN with linear activation functions are expressively equivalent. Furthermore, we introduce the first provable learning algorithm for linear 2-RNN defined over sequences of continuous input vectors. This algorithm relies on estimating low-rank sub-blocks of the Hankel tensor, from which the parameters of a linear 2-RNN can be provably recovered. The performance of the proposed learning algorithm is assessed in a simulation study on both synthetic and real-world data.
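To make the spectral learning recipe concrete, here is a minimal sketch of the classic Hankel-based algorithm the abstract builds on (not the paper's scaled-up tensor-train variant). The toy target function, basis choices, and variable names are illustrative assumptions: f(w) counts the occurrences of 'a' in a string over {a, b}, a function computable by a rank-2 WFA, so a 2x2 Hankel sub-block with complete prefix/suffix bases suffices for exact recovery.

```python
import numpy as np

# Toy target: f(w) = number of 'a's in w (computable by a rank-2 WFA).
def f(w):
    return w.count("a")

prefixes = ["", "a"]          # prefix basis (index 0 is the empty string)
suffixes = ["", "a"]          # suffix basis

# Hankel sub-blocks: H[p, s] = f(p s), H_sigma[p, s] = f(p sigma s)
H = np.array([[f(p + s) for s in suffixes] for p in prefixes], dtype=float)
H_sig = {c: np.array([[f(p + c + s) for s in suffixes] for p in prefixes],
                     dtype=float)
         for c in "ab"}

# Rank factorization H = P @ S via truncated SVD (rank 2 for this f)
U, D, Vt = np.linalg.svd(H)
rank = 2
P = U[:, :rank] * D[:rank]    # prefix factor (columns scaled by singular values)
S = Vt[:rank, :]              # suffix factor

# Recover the WFA: A_sigma = P^+ H_sigma S^+; alpha is the row of P for the
# empty prefix; omega is the column of S for the empty suffix.
A = {c: np.linalg.pinv(P) @ H_sig[c] @ np.linalg.pinv(S) for c in "ab"}
alpha = P[0, :]
omega = S[:, 0]

def wfa(w):
    """Evaluate the recovered WFA: alpha^T A_{w1} ... A_{wk} omega."""
    v = alpha.copy()
    for c in w:
        v = v @ A[c]
    return float(v @ omega)
```

Because the 2x2 Hankel sub-block already has full rank 2, the recovered WFA agrees with f on all strings (up to floating-point error); the paper's contribution is, in part, scaling this idea via the tensor-train structure when the Hankel matrices are very large.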

Updated: 2020-10-21