Kullback–Leibler divergence for Bayesian nonparametric model checking
Journal of the Korean Statistical Society (IF 0.6), Pub Date: 2020-06-04, DOI: 10.1007/s42952-020-00072-7
Luai Al-Labadi , Vishakh Patel , Kasra Vakiloroayaei , Clement Wan

Bayesian nonparametric statistics is an area of considerable research interest. Although much recent effort has gone into developing Bayesian nonparametric procedures for model checking, using the Dirichlet process in its simplest form together with the Kullback–Leibler divergence has remained an open problem. The difficulty stems from the discreteness of the Dirichlet process: the Kullback–Leibler divergence between any discrete distribution and any continuous distribution is infinite. The approach proposed in this paper, which combines the Dirichlet process, the Kullback–Leibler divergence, and the relative belief ratio, is considered the first concrete solution to this issue. The approach is simple to apply and does not require a closed form of the relative belief ratio. A Monte Carlo study and real data examples show that the developed approach exhibits excellent performance.
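The two computational ingredients named in the abstract, a Dirichlet process draw and a Kullback–Leibler divergence against a continuous hypothesized model, can be illustrated with a short sketch. This is a minimal illustration under stated assumptions, not the authors' algorithm: it uses a truncated stick-breaking representation with a N(0, 1) base measure, and it sidesteps the discrete-vs-continuous infinity by pushing the DP atoms through the hypothesized model's CDF and binning, rather than by the relative-belief-ratio construction the paper develops. All function names here are hypothetical.

```python
import math
import random

def stick_breaking_dp(alpha, base_sampler, trunc=200):
    """Truncated stick-breaking draw from DP(alpha, H).

    Returns atoms (i.i.d. draws from the base measure H) and their
    stick-breaking weights; leftover mass is folded into the last atom."""
    atoms = [base_sampler() for _ in range(trunc)]
    weights, remaining = [], 1.0
    for _ in range(trunc):
        v = random.betavariate(1.0, alpha)
        weights.append(remaining * v)
        remaining *= 1.0 - v
    weights[-1] += remaining
    return atoms, weights

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def discretized_kl(atoms, weights, model_cdf, bins=20):
    """KL(P || Uniform) after a probability integral transform.

    Mapping the discrete DP draw through the hypothesized model's CDF and
    binning both sides makes the divergence finite: if the model is correct,
    the transformed mass is close to uniform and the KL value is small."""
    p = [0.0] * bins
    for a, w in zip(atoms, weights):
        k = min(int(model_cdf(a) * bins), bins - 1)
        p[k] += w
    q = 1.0 / bins  # uniform bin probability
    return sum(pi * math.log(pi / q) for pi in p if pi > 0)

random.seed(1)
atoms, w = stick_breaking_dp(alpha=25.0, base_sampler=lambda: random.gauss(0.0, 1.0))
kl_match = discretized_kl(atoms, w, norm_cdf)                     # correct model
kl_wrong = discretized_kl(atoms, w, lambda x: norm_cdf(x - 3.0))  # shifted model
```

With the correct model the divergence stays near zero (up to Dirichlet-process sampling noise), while the misspecified shifted model concentrates the transformed mass in one bin and yields a much larger value, which is the qualitative behavior a model check exploits.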




Updated: 2020-07-24