Synthesizing Context-free Grammars from Recurrent Neural Networks (Extended Version)
arXiv - CS - Formal Languages and Automata Theory. Pub Date: 2021-01-20, DOI: arxiv-2101.08200
Daniel M. Yellin, Gail Weiss

We present an algorithm for extracting a subclass of the context-free grammars (CFGs) from a trained recurrent neural network (RNN). We develop a new framework, pattern rule sets (PRSs), which describe sequences of deterministic finite automata (DFAs) that approximate a non-regular language. We present an algorithm for recovering the PRS behind a sequence of such automata, and apply it to the sequences of automata extracted from trained RNNs using the L* algorithm. We then show how the PRS may be converted into a CFG, enabling a familiar and useful presentation of the learned language. Extracting the learned language of an RNN is important for facilitating understanding of the RNN and for verifying its correctness. Furthermore, the extracted CFG can augment the RNN in classifying correct sentences, as the RNN's predictive accuracy decreases when the recursion depth and the distance between matching delimiters of its input sequences increase.
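
To make the idea of "a sequence of DFAs approximating a non-regular language" concrete, here is a minimal sketch (not the paper's PRS construction; all names are hypothetical). Each DFA D_k accepts balanced parentheses of nesting depth at most k, and the sequence D_1, D_2, ... converges to the non-regular Dyck-1 language generated by the CFG S -> "" | "(" S ")" S.

# Illustrative sketch only: a sequence of regular under-approximations of
# the Dyck-1 language, one DFA per maximum nesting depth.

def make_dfa(max_depth):
    """Return an acceptor for strings over {'(', ')'} that are balanced
    and never exceed nesting depth max_depth.
    States are 0..max_depth (current depth) plus a rejecting sink."""
    SINK = -1

    def step(state, ch):
        if state == SINK:
            return SINK
        if ch == '(':
            return state + 1 if state < max_depth else SINK
        if ch == ')':
            return state - 1 if state > 0 else SINK
        return SINK  # any symbol outside the alphabet rejects

    def accepts(word):
        state = 0
        for ch in word:
            state = step(state, ch)
        return state == 0  # accept iff every delimiter is matched

    return accepts

# Each D_k is regular; their union over all k is the full Dyck-1 language.
d2 = make_dfa(2)
assert d2("(())")        # depth 2: accepted
assert not d2("((()))")  # depth 3 exceeds the bound: rejected
assert not d2("())")     # unbalanced: rejected

The abstract's final observation fits this picture: an RNN trained on such a language tends to behave like some finite D_k, degrading as the depth and the distance between matching delimiters grow, which is where a recovered CFG can take over.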

Updated: 2021-01-21