MLMLM: Link Prediction with Mean Likelihood Masked Language Model
arXiv - CS - Artificial Intelligence · Pub Date: 2020-09-15 · arXiv:2009.07058
Louis Clouatre, Philippe Trempe, Amal Zouaq, Sarath Chandar

Knowledge Bases (KBs) are easy to query, verifiable, and interpretable. They, however, scale with man-hours and high-quality data. Masked Language Models (MLMs), such as BERT, scale with computing power as well as with unstructured raw text data. The knowledge contained within those models, however, is not directly interpretable. We propose to perform link prediction with MLMs to address both the KBs' scalability issues and the MLMs' interpretability issues. To do so, we introduce MLMLM (Mean Likelihood Masked Language Model), an approach that compares the mean likelihood of generating the different entities to perform link prediction in a tractable manner. We obtain State of the Art (SotA) results on the WN18RR dataset and the best non-entity-embedding-based results on the FB15k-237 dataset. We also obtain convincing results on link prediction for previously unseen entities, making MLMLM a suitable approach for introducing new entities to a KB.
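The scoring rule the abstract describes lends itself to a short sketch. Below is a minimal illustration, not the authors' implementation, of ranking candidate tail entities for a (head, relation, ?) query by the mean log-likelihood a pretrained MLM assigns to each candidate's tokens at masked positions. The query template, the entity strings, and the `[ENT]` placeholder are hypothetical stand-ins for whatever input format the paper actually uses.

```python
# Sketch only: mean-likelihood scoring of candidate entities with a masked LM.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def mean_log_likelihood(template: str, candidate: str) -> float:
    """Replace the [ENT] placeholder with one [MASK] per candidate token,
    then return the mean log-probability the MLM assigns to recovering
    the candidate's tokens at those masked positions."""
    cand_ids = tokenizer(candidate, add_special_tokens=False)["input_ids"]
    masked = template.replace(
        "[ENT]", " ".join([tokenizer.mask_token] * len(cand_ids))
    )
    inputs = tokenizer(masked, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits[0]          # (seq_len, vocab)
    log_probs = torch.log_softmax(logits, dim=-1)
    mask_positions = (
        inputs["input_ids"][0] == tokenizer.mask_token_id
    ).nonzero(as_tuple=True)[0]
    # Mean, not sum, so candidates of different token lengths stay comparable.
    scores = [log_probs[pos, tok].item()
              for pos, tok in zip(mask_positions, cand_ids)]
    return sum(scores) / len(scores)

# Hypothetical WN18RR-style query: rank tails for (dog, hypernym, ?).
query = "dog is a kind of [ENT] ."
candidates = ["canine", "feline", "vehicle"]
ranked = sorted(candidates, key=lambda c: mean_log_likelihood(query, c),
                reverse=True)
print(ranked)
```

Averaging token log-probabilities rather than summing them keeps entities of different tokenized lengths on a comparable scale, which is presumably why the method is built on the mean likelihood; the sketch re-runs the model once per candidate for clarity rather than efficiency.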

Updated: 2020-09-16