Augmenting Paraphrase Generation with Syntax Information Using Graph Convolutional Networks
Entropy (IF 2.7) Pub Date: 2021-05-02, DOI: 10.3390/e23050566
Xiaoqiang Chi, Yang Xiang

Paraphrase generation is an important yet challenging task in natural language processing. Neural network-based approaches have achieved remarkable success in sequence-to-sequence learning. Previous work on paraphrase generation generally ignores syntactic information, even when it is readily available, on the assumption that neural networks can learn such linguistic knowledge implicitly. In this work, we probe the efficacy of explicit syntactic information for the task of paraphrase generation. Syntactic information can take the form of dependency trees, which are easily acquired from off-the-shelf syntactic parsers. Such tree structures can be conveniently encoded with graph convolutional networks to obtain more meaningful sentence representations, which in turn improve the generated paraphrases. Through extensive experiments on four paraphrase datasets of different sizes and genres, we demonstrate the utility of syntactic information for neural paraphrase generation under the sequence-to-sequence modeling framework. Specifically, our graph convolutional network-enhanced models consistently outperform their syntax-agnostic counterparts on multiple evaluation metrics.
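The encoding step the abstract describes (dependency arcs from an off-the-shelf parser fed through a graph convolution over token states) can be sketched as follows. This is a minimal illustration under assumed conventions, not the authors' implementation: it uses PyTorch, and the layer, variable names, and example adjacency matrix are hypothetical.

```python
# Minimal sketch (assumption, not the paper's code): one graph convolution
# over a dependency-tree adjacency matrix on top of token representations.
import torch
import torch.nn as nn


class GCNLayer(nn.Module):
    """A single graph convolution over a dependency-tree adjacency matrix."""

    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, h, adj):
        # h:   (batch, seq_len, dim) token representations from the encoder
        # adj: (batch, seq_len, seq_len) dependency adjacency with self-loops
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1)  # node degrees
        h = adj @ self.linear(h) / deg                    # mean over neighbors
        return torch.relu(h)


# Hypothetical usage: adjacency built from a parser's (head, dependent) arcs.
batch, seq_len, dim = 2, 5, 16
h = torch.randn(batch, seq_len, dim)
adj = torch.eye(seq_len).expand(batch, -1, -1).clone()   # self-loops
adj[:, 0, 1] = 1.0                                       # example arc 0 -> 1
adj[:, 1, 0] = 1.0                                       # and its reverse
layer = GCNLayer(dim)
sentence_repr = layer(h, adj)                            # syntax-aware states
```

In a sequence-to-sequence setup of this kind, such syntax-aware states would augment or replace the plain encoder states consumed by the decoder.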

Updated: 2021-05-03