Learning Bayesian networks using A* search with ancestral constraints
Neurocomputing (IF 6), Pub Date: 2021-04-20, DOI: 10.1016/j.neucom.2021.04.054
Zidong Wang, Xiaoguang Gao, Xiangyuan Tan, Xiaohan Liu

When a Bayesian network is used to model a practical problem, weak prior knowledge expressed as ancestral constraints is often available and needs to be incorporated. However, such non-decomposable constraints are difficult to exploit directly in search strategies built on decomposable scores. In this study, we address this problem by constructing an implicit path-space search graph and adapting the A* algorithm to it, so that the globally optimal structure satisfying the given constraints can be obtained. Within the new framework, we use a maximum covering principle to derive effective pruning rules from these constraints. Moreover, we extend the simple heuristic and the static k-cycle conflict heuristic to accommodate ancestral constraints, and we theoretically prove that the new heuristic functions remain admissible and consistent. Our experiments demonstrate that, when ancestral constraints are integrated, the proposed framework with the new heuristics significantly reduces the space complexity of A* search compared with state-of-the-art frameworks such as Bayesian network graphs and equivalence class trees.
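To make the path-space idea concrete, the sketch below runs A* over the standard order graph used in score-based structure learning: each search node is a subset of variables, expanding a node assigns the next variable its cheapest parent set drawn from that subset, and the simple heuristic lets every remaining variable pick its best parent set while ignoring acyclicity. The three-variable cost table, the single positive ancestral constraint, and the ordering-based pruning rule (discarding nodes that contain the required descendant but not its ancestor, which is only a necessary condition) are illustrative assumptions for this sketch; they are not the paper's implicit path-space graph, its maximum covering principle, or its improved heuristics.

```python
import heapq
from itertools import combinations, count

VARS = ("A", "B", "C")

# Toy decomposable local costs: cost(child, parent_set), lower is better.
# In practice these would be negated BIC/BDeu scores computed from data.
COST = {
    ("A", frozenset()): 10.0,
    ("A", frozenset({"B"})): 9.0,
    ("A", frozenset({"C"})): 9.5,
    ("A", frozenset({"B", "C"})): 8.8,
    ("B", frozenset()): 12.0,
    ("B", frozenset({"A"})): 7.0,
    ("B", frozenset({"C"})): 11.0,
    ("B", frozenset({"A", "C"})): 6.5,
    ("C", frozenset()): 11.0,
    ("C", frozenset({"A"})): 10.5,
    ("C", frozenset({"B"})): 8.0,
    ("C", frozenset({"A", "B"})): 7.5,
}

# One positive ancestral constraint: "A must be an ancestor of B".
# A necessary condition is that A precedes B in every topological order,
# so order-graph nodes containing B but not A are pruned below.
ANCESTOR, DESCENDANT = "A", "B"


def best_parent_choice(child, available):
    """Cheapest parent set for `child` drawn from the variables in `available`."""
    best = None
    for r in range(len(available) + 1):
        for ps in combinations(sorted(available), r):
            c = COST[(child, frozenset(ps))]
            if best is None or c < best[0]:
                best = (c, frozenset(ps))
    return best


def simple_heuristic(remaining):
    """'Simple' heuristic: every remaining variable takes its best parent set
    over all other variables, ignoring acyclicity (admissible, consistent)."""
    return sum(best_parent_choice(v, set(VARS) - {v})[0] for v in remaining)


def astar():
    start, goal = frozenset(), frozenset(VARS)
    tie = count()                      # tie-breaker for the priority queue
    frontier = [(simple_heuristic(VARS), 0.0, next(tie), start, {})]
    best_g = {start: 0.0}
    while frontier:
        f, g, _, node, parents = heapq.heappop(frontier)
        if g > best_g.get(node, float("inf")):
            continue                   # stale queue entry
        if node == goal:
            return g, parents          # optimal cost and chosen parent sets
        for v in set(VARS) - node:
            succ = node | {v}
            # Ordering-based pruning derived from the ancestral constraint.
            if DESCENDANT in succ and ANCESTOR not in succ:
                continue
            cost, ps = best_parent_choice(v, node)
            g2 = g + cost
            if g2 < best_g.get(succ, float("inf")):
                best_g[succ] = g2
                h = simple_heuristic(set(VARS) - succ)
                heapq.heappush(frontier, (g2 + h, g2, next(tie), succ,
                                          {**parents, v: ps}))
    return None


if __name__ == "__main__":
    total_cost, parent_sets = astar()
    print("optimal total cost:", total_cost)
    for child in VARS:
        print(f"{child} <- {sorted(parent_sets[child])}")
```

With the toy costs above, the search returns the network B <- {A}, C <- {A, B}, which happens to have a directed path from A to B; in general the ordering check alone does not guarantee a positive ancestral constraint is satisfied, which is why the paper derives stronger pruning rules directly in the path-space graph.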




Updated: 2021-05-08