Improving Graph Neural Network Representations of Logical Formulae with Subgraph Pooling
arXiv - CS - Logic in Computer Science. Pub Date: 2019-11-15, DOI: arXiv-1911.06904
Maxwell Crouse; Ibrahim Abdelaziz; Cristina Cornelio; Veronika Thost; Lingfei Wu; Kenneth Forbus; Achille Fokoue

Recent advances in the integration of deep learning with automated theorem proving have centered around the representation of logical formulae as inputs to deep learning systems. In particular, there has been a growing interest in adapting structure-aware neural methods to work with the underlying graph representations of logical expressions. While more effective than character- and token-level approaches, such methods have often made representational trade-offs that limited their ability to capture key structural properties of their inputs. In this work, we propose a novel graph-based approach for embedding logical formulae that is designed to overcome the representational limitations of prior approaches. Our proposed architecture works for logics of different expressivity, e.g., first-order and higher-order logic. We evaluate our approach on two standard datasets and show that the proposed architecture significantly improves performance on premise selection and proof step classification compared to the state of the art.
Updated: 2020-02-13
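For readers wanting a concrete picture of the general pipeline the abstract refers to, the sketch below shows, in plain Python, how a first-order formula might be turned into a graph (one node per symbol, with edges from operators to their arguments), how node embeddings can be refined by a toy message-passing step, and how a graph-level vector can be read out by pooling over node embeddings. This is only a minimal illustration under assumed choices (random initial features, a fixed 0.5 mixing weight, element-wise max pooling); it is not the architecture proposed in the paper.

```python
# Minimal sketch: logical formula -> graph -> toy message passing -> pooled embedding.
# NOT the authors' architecture; the feature map, update rule, and pooling choice
# below are placeholder assumptions for illustration only.

import random

# Formula: forall x. P(x) -> Q(f(x)), written as a nested tuple AST.
FORMULA = ("forall", "x", ("->", ("P", "x"), ("Q", ("f", "x"))))

def build_graph(ast, nodes, edges):
    """Add the AST rooted at `ast` to the graph; return its node id."""
    if isinstance(ast, str):                      # variable or constant leaf
        nodes.append(ast)
        return len(nodes) - 1
    head, *args = ast
    nodes.append(head)                            # operator / quantifier / predicate node
    parent = len(nodes) - 1
    for arg in args:
        child = build_graph(arg, nodes, edges)
        edges.append((parent, child))             # edge from operator to its argument
    return parent

def embed(nodes, edges, dim=8, rounds=2, seed=0):
    """Toy message passing followed by max-pooling over all node embeddings."""
    rng = random.Random(seed)
    # Random initial feature vector per distinct symbol (stand-in for learned embeddings).
    table = {s: [rng.uniform(-1, 1) for _ in range(dim)] for s in set(nodes)}
    h = [list(table[s]) for s in nodes]
    for _ in range(rounds):
        new_h = [list(v) for v in h]
        for parent, child in edges:               # propagate information along each edge
            for d in range(dim):
                new_h[parent][d] += 0.5 * h[child][d]
                new_h[child][d] += 0.5 * h[parent][d]
        h = new_h
    # Graph-level readout: element-wise max over node embeddings (one simple pooling choice).
    return [max(v[d] for v in h) for d in range(dim)]

nodes, edges = [], []
build_graph(FORMULA, nodes, edges)
print("nodes:", nodes)
print("embedding:", [round(x, 3) for x in embed(nodes, edges)])
```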

 
