Kernel multi-attention neural network for knowledge graph embedding
Knowledge-Based Systems ( IF 7.2 ) Pub Date : 2021-06-04 , DOI: 10.1016/j.knosys.2021.107188
Dan Jiang , Ronggui Wang , Juan Yang , Lixia Xue

Link prediction is the problem of predicting missing links between entities in a knowledge graph. In recent years, several methods have achieved notable success on link prediction, but they do little to enrich the entity and relation vectors and therefore cannot predict missing links as effectively. In this paper, we propose a novel link prediction method, the kernel multi-attention neural network for knowledge graph embedding (KMAE), which extends kernels separately over entity and relation attributes. A Gaussian kernel function expands the embeddings into more robust entity and relation kernels. In addition, we construct a novel multi-attention neural network that acts on the entity kernel and relation kernel to capture locally important features. Experiments on FB15k-237 and WN18RR show that multi-attention performs strongly on the knowledge graph embedding task, and our proposed KMAE achieves better results than previous state-of-the-art link prediction methods.
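The two ingredients the abstract names, Gaussian kernel expansion of an embedding and multi-attention pooling over the resulting kernels, can be illustrated with a minimal numpy sketch. This is not the authors' implementation; the kernel centres, the number of heads, and the random scoring vectors are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel_features(x, centers, sigma=1.0):
    # x: (d,) embedding; centers: (K,) kernel centres applied element-wise.
    # Returns a (K, d) matrix of Gaussian responses, one row per kernel,
    # i.e. the embedding "expanded" into K kernel views.
    diff = x[None, :] - centers[:, None]
    return np.exp(-diff ** 2 / (2 * sigma ** 2))

def multi_attention_pool(feats, n_heads=2, seed=0):
    # feats: (K, d). Each head scores the K kernel rows and combines them
    # with a softmax, so different heads can attend to different kernels
    # (a stand-in for "capturing locally important features").
    rng = np.random.default_rng(seed)
    K, d = feats.shape
    pooled = []
    for _ in range(n_heads):
        w = rng.normal(size=d)          # hypothetical learned scoring vector
        scores = feats @ w              # (K,) one score per kernel
        alpha = np.exp(scores - scores.max())
        alpha /= alpha.sum()            # softmax over the K kernels
        pooled.append(alpha @ feats)    # (d,) attention-weighted combination
    return np.concatenate(pooled)       # (n_heads * d,)

entity = np.array([0.2, -0.5, 0.8])
centers = np.array([-1.0, 0.0, 1.0])
phi = gaussian_kernel_features(entity, centers)   # shape (3, 3)
out = multi_attention_pool(phi, n_heads=2)        # shape (6,)
print(phi.shape, out.shape)
```

In the full model the same expansion would be applied to relation embeddings as well, and the scoring vectors would be learned rather than sampled.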




Updated: 2021-06-09