Top program construction and reduction for polynomial time Meta-Interpretive learning
Machine Learning (IF 7.5), Pub Date: 2021-02-08, DOI: 10.1007/s10994-020-05945-w
S. Patsantzis, S. H. Muggleton

Meta-Interpretive Learners, like most ILP systems, learn by searching for a correct hypothesis in the hypothesis space, the powerset of all constructible clauses. We show how this exponentially-growing search can be replaced by the construction of a Top program: the set of clauses in all correct hypotheses that is itself a correct hypothesis. We give an algorithm for Top program construction and show that it constructs a correct Top program in polynomial time and from a finite number of examples. We implement our algorithm in Prolog as the basis of a new MIL system, Louise, that constructs a Top program and then reduces it by removing redundant clauses. We compare Louise to the state-of-the-art search-based MIL system Metagol in experiments on grid world navigation, graph connectedness and grammar learning datasets and find that Louise improves on Metagol’s predictive accuracy when the hypothesis space and the target theory are both large, or when the hypothesis space does not include a correct hypothesis because of “classification noise” in the form of mislabelled examples. When the hypothesis space or the target theory are small, Louise and Metagol perform equally well.
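
The core idea summarised above, replacing search through the powerset of constructible clauses with a two-step construction of the Top program, can be illustrated with a minimal sketch. This is not Louise's Prolog implementation; `candidate_clauses` and `covers` are hypothetical stand-ins for the set of clauses constructible from the metarules and background knowledge and for entailment of an example by a clause. Each step is a single pass over the candidates and examples, rather than an enumeration of subsets of the hypothesis space.

```python
# Minimal sketch of Top program construction (not Louise's actual code).
# candidate_clauses: iterable of constructible clauses (hypothetical).
# covers(clause, example): True if the clause entails the example w.r.t.
#   the background knowledge (hypothetical stand-in for entailment).

def top_program(candidate_clauses, covers, pos_examples, neg_examples):
    # Generalisation step: keep every clause that entails at least one
    # positive example.
    generalised = [c for c in candidate_clauses
                   if any(covers(c, e) for e in pos_examples)]
    # Specialisation step: discard clauses that entail any negative example.
    specialised = [c for c in generalised
                   if not any(covers(c, e) for e in neg_examples)]
    # The surviving clauses form the Top program: the clauses of all correct
    # hypotheses, itself a correct hypothesis. A separate reduction step
    # would then remove redundant clauses.
    return specialised
```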

Updated: 2021-02-09