A deep embedding model for knowledge graph completion based on attention mechanism
Neural Computing and Applications (IF 6) | Pub Date: 2021-03-16 | DOI: 10.1007/s00521-021-05742-z
Jin Huang, TingHua Zhang, Jia Zhu, Weihao Yu, Yong Tang, Yang He

Knowledge graph completion has become a well-studied yet non-trivial task as knowledge graphs find broad application. Many approaches have been proposed to address it, including the family of Trans models, semantic matching models, and methods based on convolutional neural networks. However, Trans models and semantic matching models focus only on shallow information in the knowledge graph and therefore fail to capture the implicit fine-grained features within triples; convolutional-neural-network-based methods learn more expressive features for knowledge graph completion, but they ignore the directional characteristics of relations and the implicit fine-grained features within triples. In this paper, we propose a novel knowledge graph completion model, named the directional multi-dimensional attention convolution model, which exploits directional information and the inherent deep expressive characteristics of triples. Finally, we evaluate the model under three standard evaluation criteria on two widely used datasets, and the experiments show that it achieves state-of-the-art MeanRank.
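To make the "Trans model" baseline mentioned above concrete, the sketch below shows the translation idea behind TransE, the best-known member of that family: a relation is modeled as a vector translation, so a triple (h, r, t) is plausible when h + r lies close to t. This is a minimal illustration of the general technique, not the paper's proposed model; the embedding dimension and vectors are arbitrary.

```python
import numpy as np

def transe_score(h, r, t, norm=1):
    """TransE-style plausibility score for a triple (h, r, t).

    A triple is considered plausible when h + r is close to t,
    so a higher (less negative) score means a more plausible triple.
    """
    return -np.linalg.norm(h + r - t, ord=norm)

rng = np.random.default_rng(0)
dim = 50  # illustrative embedding dimension

h = rng.normal(size=dim)                      # head entity embedding
r = rng.normal(size=dim)                      # relation embedding
t_good = h + r + 0.01 * rng.normal(size=dim)  # tail that nearly satisfies h + r ≈ t
t_bad = rng.normal(size=dim)                  # unrelated random tail

# The well-formed triple scores higher than the corrupted one.
assert transe_score(h, r, t_good) > transe_score(h, r, t_bad)
```

Because such a score only compares whole embedding vectors, it captures the shallow structural signal the abstract refers to; it has no mechanism for the fine-grained, per-dimension interactions that attention- or convolution-based models are designed to learn.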




Updated: 2021-03-16