BERT Meets Relational DB: Contextual Representations of Relational Databases
arXiv - CS - Databases. Pub Date: 2021-04-30, DOI: arxiv-2104.14914
Siddhant Arora, Vinayak Gupta, Garima Gaur, Srikanta Bedathur

In this paper, we address the problem of learning low-dimensional representations of entities in relational databases consisting of multiple tables. Embeddings help to capture the semantics encoded in the database and can be used in a variety of settings, such as auto-completion of tables, fully neural query processing of relational join queries, seamless handling of missing values, and more. Current work is restricted to a single table, or uses embeddings pretrained on an external corpus, making it unsuitable for real-world databases. In this work, we look into ways of using attention-based models to learn embeddings for entities in a relational database. We are inspired by BERT-style pretraining methods and are interested in how they can be extended to representation learning on structured databases. We evaluate our approach on the autocompletion of relational databases and achieve improvements over standard baselines.
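
The abstract ships no code, so the sketch below is only an illustration of the kind of BERT-style masked pretraining it describes, applied to relational rows rather than the paper's actual method. Everything in it is an assumption made for the example: the toy table, the cell-level vocabulary, the column embeddings standing in for BERT's position embeddings, and all hyperparameters. It masks one cell per row, trains a small transformer encoder to recover it, and then reuses the same masked prediction as table autocompletion.

# A minimal sketch (not the authors' code) of BERT-style masked pretraining
# over rows of a relational table. The table, vocabulary, and hyperparameters
# are illustrative assumptions.
import torch
import torch.nn as nn

# Toy relation: (city, country, language). Each cell value is one token.
rows = [
    ("toronto", "canada", "english"),
    ("paris", "france", "french"),
    ("berlin", "germany", "german"),
    ("lyon", "france", "french"),
    ("munich", "germany", "german"),
]

# Vocabulary over distinct cell values, plus a [MASK] token.
vocab = {"[MASK]": 0}
for row in rows:
    for cell in row:
        vocab.setdefault(cell, len(vocab))
inv_vocab = {i: w for w, i in vocab.items()}

class RowEncoder(nn.Module):
    # Embeds each cell value, adds a column embedding in place of BERT's
    # position embedding, and emits a vocabulary distribution per cell.
    def __init__(self, vocab_size, d_model=64, nhead=4, nlayers=2, ncols=3):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)
        self.col = nn.Embedding(ncols, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.enc = nn.TransformerEncoder(layer, nlayers)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, ids):  # ids: (batch, ncols) of token indices
        cols = torch.arange(ids.size(1)).expand_as(ids)
        h = self.enc(self.tok(ids) + self.col(cols))
        return self.out(h)  # (batch, ncols, vocab) logits

model = RowEncoder(len(vocab))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Masked-cell pretraining: hide one cell per row, predict it from the rest.
for step in range(300):
    ids = torch.tensor([[vocab[c] for c in r] for r in rows])
    target = ids.clone()
    batch_idx = torch.arange(ids.size(0))
    mask_pos = torch.randint(0, ids.size(1), (ids.size(0),))
    ids[batch_idx, mask_pos] = vocab["[MASK]"]
    logits = model(ids)
    loss = loss_fn(logits[batch_idx, mask_pos], target[batch_idx, mask_pos])
    opt.zero_grad()
    loss.backward()
    opt.step()

# Autocompletion: given ("lyon", "france", [MASK]), predict the missing cell.
model.eval()
with torch.no_grad():
    query = torch.tensor([[vocab["lyon"], vocab["france"], vocab["[MASK]"]]])
    pred = model(query)[0, 2].argmax().item()
print(inv_vocab[pred])  # likely "french" once the toy table has been fit

Treating the column id as the position signal is one plausible way to serialize a row into a token sequence; the paper may well tokenize cell values and encode table structure differently.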

Updated: 2021-05-03