Relation-based multi-type aware knowledge graph embedding
Neurocomputing (IF 5.5) Pub Date: 2021-05-11, DOI: 10.1016/j.neucom.2021.05.021
Yingying Xue, Jiahui Jin, Aibo Song, Yingxue Zhang, Yangyang Liu, Kaixuan Wang

Knowledge graph (KG) embedding projects a graph into a low-dimensional space while preserving its structural information. An essential part of a KG is the ontology, which is typically organized as a taxonomy tree that describes the type (or multiple types) of each entity and the hierarchical relationships among these types. Considering the ontology during KG embedding is important because it provides side information that improves the accuracy of downstream applications (e.g., link prediction, entity alignment, or recommendation). However, the ontology has yet to receive adequate attention in KG embedding, especially in cases where an entity may belong to multiple types. Ontology-enhanced KG embedding faces two main challenges: discovering the relationships among these types, and integrating them with the entities' relational network. Although attention-based models are common in KG embedding, they cannot address both issues simultaneously: they assign only a single type to each entity and ignore the correlations among types, leading to information loss and hindering downstream tasks. To overcome these challenges, we propose a composite multi-type aware KG embedding model whose main components are a multi-type layer and an entity embedding layer. At the multi-type layer, we cast the problem as a natural language processing task to discover each entity's multi-type features and automatically capture their correlations. At the entity embedding layer, a relation-based attention mechanism aggregates neighborhood information and integrates the multi-type layer's information through the entities shared by the two layers. Extensive experiments on two real KGs demonstrate that, compared with several state-of-the-art baselines, our Multi-Type aware Embedding (MTE) model achieves substantial gains in Mean Rank and Hit@N for link prediction and in accuracy for multi-type classification.
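To make the relation-based attention idea concrete, the sketch below shows one minimal, illustrative PyTorch formulation: neighbour messages are built from relation and tail-entity embeddings, attention scores are computed against the centre entity, and the aggregated neighbourhood is fused with the entity's multi-type feature vector. This is not the paper's actual MTE model; the module name and all parameters (RelationAwareAttention, msg, att, fuse) are hypothetical assumptions for illustration only.

import torch
import torch.nn as nn
import torch.nn.functional as F

class RelationAwareAttention(nn.Module):
    """Illustrative relation-based attention over an entity's neighbourhood.

    For a centre entity e with neighbours {(r_i, t_i)}, each neighbour message
    is built from the relation and tail embeddings; attention weights are
    computed per (relation, entity) pair and the weighted sum is fused with a
    multi-type feature vector for e.
    """

    def __init__(self, dim):
        super().__init__()
        self.msg = nn.Linear(2 * dim, dim)   # message from [relation ; tail]
        self.att = nn.Linear(2 * dim, 1)     # score from [centre ; message]
        self.fuse = nn.Linear(2 * dim, dim)  # combine structural and type views

    def forward(self, centre, rel_emb, tail_emb, type_feat):
        # centre: (dim,), rel_emb/tail_emb: (n_neighbours, dim), type_feat: (dim,)
        messages = torch.tanh(self.msg(torch.cat([rel_emb, tail_emb], dim=-1)))
        centre_rep = centre.expand_as(messages)
        scores = self.att(torch.cat([centre_rep, messages], dim=-1)).squeeze(-1)
        alpha = F.softmax(scores, dim=0)                     # one weight per neighbour
        neigh = (alpha.unsqueeze(-1) * messages).sum(dim=0)  # aggregated neighbourhood
        return torch.tanh(self.fuse(torch.cat([neigh, type_feat], dim=-1)))

# Toy usage with random embeddings for a centre entity with five neighbours.
dim = 8
layer = RelationAwareAttention(dim)
out = layer(torch.randn(dim), torch.randn(5, dim), torch.randn(5, dim), torch.randn(dim))
print(out.shape)  # torch.Size([8])

The key design point this sketch reflects is that attention weights depend on the relation as well as the neighbouring entity, so the same neighbour can contribute differently through different relations, and the type-level view enters only through the final fusion with the entity's multi-type features.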




Updated: 2021-06-09