Direction-sensitive relation extraction using Bi-SDP attention model
Knowledge-Based Systems (IF 7.2) · Pub Date: 2020-04-24 · DOI: 10.1016/j.knosys.2020.105928
Hailin Wang , Ke Qin , Guoming Lu , Guangchun Luo , Guisong Liu

Relation extraction is a crucial task in natural language processing (NLP). It plays a key role in question answering, web search, information retrieval, and other applications. Previous research on this task has verified the effectiveness of attention mechanisms, shortest dependency paths (SDP), and LSTMs. However, most of these methods focus on learning a semantic representation of the whole sentence, highlighting the importance of a subset of words, or pruning the sentence with the SDP. They ignore the information lost in doing so, such as the dependency relation attached to each word and the prepositions that indicate the direction of the relation. Besides, SDP-based approaches are prone to over-pruning. Based on these observations, this paper presents a framework with a Bi-directional SDP (Bi-SDP) attention mechanism to tackle these challenges. The Bi-SDP is a novel representation of the SDP, consisting of the original SDP and its reverse. The attention mechanism, based on the Bi-SDP, builds parallel word-level attention to capture relational semantic words and directional words. Furthermore, we explore a novel pruning strategy that simultaneously minimizes the length of the input instance and the number of RNN cells. Experiments are conducted on two datasets: the SemEval-2010 Task 8 dataset and the KBP37 dataset. Compared with previously published models, our method achieves competitive performance on SemEval-2010 Task 8 and outperforms existing models on KBP37. Our experimental results also show that directional prepositions in sentences are useful for relation extraction and improve performance on relations with an apparent physical direction.
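The abstract describes the Bi-SDP attention only at a high level (an SDP concatenated with its reverse, driving parallel word-level attention over the sentence) and does not give the exact equations. The following is a minimal, illustrative PyTorch sketch of how such a layer could be wired up; the bilinear scoring function, the max-pooling over Bi-SDP tokens, and all tensor shapes are assumptions for illustration, not the authors' formulation.

```python
# Minimal sketch of a Bi-SDP word-level attention layer (assumed design,
# not the paper's exact method).
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiSDPAttention(nn.Module):
    def __init__(self, hidden_dim: int):
        super().__init__()
        # Bilinear scoring matrix (assumed form of the attention score).
        self.W = nn.Parameter(torch.randn(hidden_dim, hidden_dim) * 0.01)

    def forward(self, sent_h: torch.Tensor, sdp_h: torch.Tensor) -> torch.Tensor:
        """
        sent_h: (batch, seq_len, hidden)  LSTM states of the (pruned) sentence
        sdp_h:  (batch, sdp_len, hidden)  LSTM states of the SDP tokens
        Returns an attention-weighted sentence representation.
        """
        # Bi-SDP: the original SDP followed by its reverse, as in the abstract.
        bi_sdp_h = torch.cat([sdp_h, torch.flip(sdp_h, dims=[1])], dim=1)  # (b, 2*sdp_len, h)

        # Word-level scores between every sentence token and every Bi-SDP token.
        scores = sent_h @ self.W @ bi_sdp_h.transpose(1, 2)  # (b, seq_len, 2*sdp_len)

        # Pool over Bi-SDP tokens, then normalise over sentence positions.
        alpha = F.softmax(scores.max(dim=-1).values, dim=-1)  # (b, seq_len)

        # Weighted sum of sentence states, fed to the relation classifier.
        return torch.einsum('bs,bsh->bh', alpha, sent_h)  # (b, hidden)

# Usage example with random tensors standing in for encoder outputs.
attn = BiSDPAttention(hidden_dim=128)
sentence_states = torch.randn(4, 30, 128)   # 4 sentences, 30 tokens each
sdp_states = torch.randn(4, 6, 128)         # 6 tokens on each shortest dependency path
sentence_repr = attn(sentence_states, sdp_states)  # (4, 128)
```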




Updated: 2020-04-24