Beyond Graph Neural Networks with Lifted Relational Neural Networks
arXiv - CS - Neural and Evolutionary Computing. Pub Date: 2020-07-13, DOI: arXiv:2007.06286
Gustav Sourek, Filip Zelezny, Ondrej Kuzelka

We demonstrate a declarative differentiable programming framework based on the language of Lifted Relational Neural Networks, where small parameterized logic programs are used to encode relational learning scenarios. When presented with relational data, such as various forms of graphs, the program interpreter dynamically unfolds differentiable computational graphs to be used for the program parameter optimization by standard means. Owing to the underlying declarative Datalog abstraction, this results in compact and elegant learning programs, in contrast with the existing procedural approaches operating directly on the computational graph level. We illustrate how this idea can be used for an efficient encoding of a diverse range of existing advanced neural architectures, with a particular focus on Graph Neural Networks (GNNs). Additionally, we show how contemporary GNN models can be easily extended towards higher relational expressiveness. In the experiments, we demonstrate correctness and computational efficiency through comparison against specialized GNN deep learning frameworks, while shedding some light on the learning performance of existing GNN models.
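The core idea of the abstract can be sketched concretely: a single declarative rule such as `h1(X) :- edge(X, Y), feature(Y)` is grounded against a concrete input graph, unfolding for each node into a differentiable aggregation over its neighbors, i.e., one GNN message-passing layer. The rule syntax and the names below are illustrative assumptions, not the actual LRNN/NeuraLogic API:

```python
import math

def matvec(W, v):
    """Multiply matrix W (list of rows) by vector v."""
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def unfold_rule(edges, features, W):
    """Ground a rule like  h1(X) :- W, edge(X, Y), feature(Y)  on a graph:
    for every node X, sum the (shared) weighted features of all Y with
    edge(X, Y), then apply a differentiable nonlinearity. This grounding
    is exactly one GNN aggregation step, built dynamically from the data."""
    nodes = sorted({n for e in edges for n in e})
    out = {}
    for x in nodes:
        neighbors = [y for (u, y) in edges if u == x]
        agg = [0.0] * len(W)
        for y in neighbors:
            agg = [a + m for a, m in zip(agg, matvec(W, features[y]))]
        out[x] = [math.tanh(a) for a in agg]  # differentiable activation
    return out

# Toy relational data: a triangle graph with bidirectional edges.
edges = [(0, 1), (1, 0), (1, 2), (2, 1), (0, 2), (2, 0)]
features = {0: [1.0, 0.0, 0.0], 1: [0.0, 1.0, 0.0], 2: [0.0, 0.0, 1.0]}
W = [[0.5, 0.5, 0.5], [0.5, 0.5, 0.5]]  # shared learnable rule weights

h = unfold_rule(edges, features, W)
```

In the actual framework, such unfolded graphs are then optimized end to end by standard gradient descent, with the rule weights `W` shared across all groundings.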

Updated: 2020-07-15