Cross-sentence N-ary relation classification using LSTMs on graph and sequence structures
Knowledge-Based Systems ( IF 8.8 ) Pub Date : 2020-07-25 , DOI: 10.1016/j.knosys.2020.106266
Lulu Zhao , Weiran Xu , Sheng Gao , Jun Guo

Relation classification is an important semantic processing task in Natural Language Processing (NLP). Past work mainly focused on binary relations within a single sentence. Recently, cross-sentence N-ary relation classification, which detects relations among n entities spanning multiple sentences, has attracted growing interest. Dependency-tree-based methods and some Graph Neural Network (GNN) based methods have been proposed to convey rich structural information. However, it remains challenging to fully exploit the relevant information in dependency trees while ignoring the irrelevant information. In this paper, we propose a Graph Attention-based LSTM (GA LSTM) network to make full use of the relevant graph structure information. The dependency tree of multiple sentences is divided into many subtrees, each rooted at a word of the sentence, with its leaf nodes regarded as that word's neighborhood. A graph attention mechanism aggregates the local information in each neighborhood, allowing the network to identify the relevant information in the dependency tree. On the other hand, because GNNs depend heavily on the graph structure of the sentence and lack contextual sequence information, their effectiveness on this task is limited. To tackle this problem, we propose an N-gram Graph LSTM (NGG LSTM) network, which updates the hidden states by aggregating both graph-neighbor node information and the inherent sequential structure of the sentence. Experimental results show that our methods outperform most existing methods.
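To illustrate the attention-based neighborhood aggregation described above, here is a minimal, framework-free sketch of attending over the leaf nodes of a dependency subtree from its root word. It is not the authors' implementation: the function name, dot-product scoring, and vector representation are all illustrative assumptions; the paper's GA LSTM learns its attention parameters jointly with the LSTM.

```python
import math

def attention_aggregate(root_vec, neighbor_vecs):
    """Aggregate the leaf-node vectors of a dependency subtree into one
    context vector. Each neighbor is scored by its dot product with the
    root word's vector, scores are normalized with a softmax, and the
    neighbors are summed with those weights (illustrative sketch only)."""
    # Dot-product score between the root and each neighbor.
    scores = [sum(r * n for r, n in zip(root_vec, nv)) for nv in neighbor_vecs]
    # Numerically stable softmax over the scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Weighted sum of neighbor vectors = aggregated local information.
    dim = len(root_vec)
    return [sum(w * nv[i] for w, nv in zip(weights, neighbor_vecs))
            for i in range(dim)]
```

A root vector aligned with one neighbor will weight that neighbor more heavily, which is the sense in which the mechanism keeps relevant and downplays irrelevant subtree information.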




Updated: 2020-08-23