Improving Graph Neural Network Representations of Logical Formulae with Subgraph Pooling
arXiv - CS - Symbolic Computation. Pub Date: 2019-11-15, arXiv: 1911.06904
Maxwell Crouse, Ibrahim Abdelaziz, Cristina Cornelio, Veronika Thost, Lingfei Wu, Kenneth Forbus, Achille Fokoue

Recent advances in the integration of deep learning with automated theorem proving have centered around the representation of logical formulae as inputs to deep learning systems. In particular, there has been a growing interest in adapting structure-aware neural methods to work with the underlying graph representations of logical expressions. While more effective than character and token-level approaches, graph-based methods have often made representational trade-offs that limited their ability to capture key structural properties of their inputs. In this work we propose a novel approach for embedding logical formulae that is designed to overcome the representational limitations of prior approaches. Our architecture works for logics of different expressivity; e.g., first-order and higher-order logic. We evaluate our approach on two standard datasets and show that the proposed architecture achieves state-of-the-art performance on both premise selection and proof step classification.
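The core idea is to feed a graph representation of a logical formula, rather than its character or token sequence, into a structure-aware neural encoder. As a rough illustration only (not the authors' implementation), the sketch below converts a small first-order expression into node labels and parent-child edges, the kind of graph a message-passing GNN could consume; the `Term` class and `term_to_graph` helper are hypothetical names introduced for this example, and the sketch builds a plain tree-shaped graph without the richer representational choices (e.g. sharing repeated subterms) that such methods may make.

```python
# Minimal sketch (not the paper's code): turning a first-order expression into
# a graph for a message-passing GNN. All names here are illustrative.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class Term:
    """A first-order (sub)expression: a symbol applied to zero or more arguments."""
    symbol: str
    args: List["Term"] = field(default_factory=list)


def term_to_graph(root: Term) -> Tuple[List[str], List[Tuple[int, int]]]:
    """Flatten an expression tree into (node_labels, edges).

    Each node gets an integer id; an edge (parent, child) is recorded for every
    argument, so the syntactic structure is preserved in the graph.
    """
    labels: List[str] = []
    edges: List[Tuple[int, int]] = []

    def visit(t: Term) -> int:
        node_id = len(labels)
        labels.append(t.symbol)
        for arg in t.args:
            child_id = visit(arg)
            edges.append((node_id, child_id))
        return node_id

    visit(root)
    return labels, edges


if __name__ == "__main__":
    # forall x. P(f(x), g(x)) -- encoded loosely as nested Terms
    formula = Term("forall", [Term("x"),
                              Term("P", [Term("f", [Term("x")]),
                                         Term("g", [Term("x")])])])
    labels, edges = term_to_graph(formula)
    print(labels)  # ['forall', 'x', 'P', 'f', 'x', 'g', 'x']
    print(edges)   # [(0, 1), (3, 4), (2, 3), (5, 6), (2, 5), (0, 2)]
```

In practice, the node labels would be embedded as feature vectors and the edge list passed to a graph neural network, with some pooling step aggregating node (or subgraph) embeddings into a single formula embedding for downstream tasks such as premise selection.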

Updated: 2020-06-08