Learning optimal decision trees using constraint programming
Constraints (IF 0.5) Pub Date: 2020-10-29, DOI: 10.1007/s10601-020-09312-3
Hélène Verhaeghe, Siegfried Nijssen, Gilles Pesant, Claude-Guy Quimper, Pierre Schaus

Decision trees are among the most popular classification models in machine learning. Traditionally, they are learned using greedy algorithms. However, such algorithms have several drawbacks: it is difficult to limit the size of the decision trees while maintaining good classification accuracy, and it is hard to impose additional constraints on the models that are learned. For these reasons, there has been recent interest in exact and flexible algorithms for learning decision trees. In this paper, we introduce a new approach to learning decision trees using constraint programming. Compared to earlier approaches, we show that our approach obtains better performance, while still being sufficiently flexible to allow for the inclusion of constraints. Our approach builds on three key building blocks: (1) the use of AND/OR search, (2) the use of caching, and (3) the use of the CoverSize global constraint, recently proposed for the problem of itemset mining. This allows our constraint programming approach to handle the decompositions in the learning problem much more efficiently.
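The interplay of the first two building blocks, AND/OR search and caching, can be illustrated with a toy sketch. The dataset, function names, and brute-force formulation below are illustrative assumptions only; they are not the paper's constraint programming model (which additionally relies on the CoverSize global constraint). Each split is an OR choice over features; once a feature is chosen, the two child subtrees are independent subproblems (an AND node), so their optima can be solved separately and summed, and identical subproblems can be cached.

```python
from functools import lru_cache

# Toy XOR-style dataset (hypothetical, for illustration only):
# two binary features per row, one binary label per row.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 1, 1, 0]
N_FEATURES = 2

@lru_cache(maxsize=None)  # caching: identical (examples, depth) subproblems are solved once
def _search(examples: frozenset, depth: int) -> int:
    if not examples:
        return 0
    ones = sum(y[i] for i in examples)
    leaf_error = min(ones, len(examples) - ones)  # majority-class leaf
    if depth == 0 or leaf_error == 0:
        return leaf_error
    best = leaf_error
    for f in range(N_FEATURES):  # OR node: choose the best feature to split on
        left = frozenset(i for i in examples if X[i][f] == 0)
        right = examples - left
        if not left or not right:
            continue
        # AND node: the two children are independent subproblems,
        # so their optimal errors can simply be summed.
        best = min(best, _search(left, depth - 1) + _search(right, depth - 1))
    return best

def best_error(depth: int) -> int:
    """Minimum misclassifications achievable by a tree of at most `depth`."""
    return _search(frozenset(range(len(X))), depth)

print(best_error(1))  # depth 1 cannot separate XOR: 2 errors
print(best_error(2))  # depth 2 separates XOR perfectly: 0 errors
```

The depth limit that is awkward to enforce in greedy induction is here an explicit parameter of the search, which is what makes size constraints (and other constraints) natural in the exact setting.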




Updated: 2020-10-30