Applications of a Kullback-Leibler Divergence for Comparing Non-nested Models.
Statistical Modelling (IF 1) Pub Date: 2013-12-01, DOI: 10.1177/1471082x13494610
Chen-Pin Wang, Booil Jo

Wang and Ghosh (2011) proposed a Kullback-Leibler divergence (KLD) that is asymptotically equivalent to the KLD of Goutis and Robert (1998) when the reference model (against which a competing fitted model is compared) is correctly specified and certain regularity conditions hold. While the properties of the KLD of Wang and Ghosh (2011) have been investigated in the Bayesian framework, this paper further explores its properties in the frequentist framework using four application examples, each fitted by two competing non-nested models.
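The kind of comparison studied here can be illustrated with a small, self-contained sketch. The Monte Carlo plug-in below is not the estimator of Wang and Ghosh (2011); it simply approximates KL(f0 || fk) = E_f0[log f0(Y) - log fk(Y)] for two competing non-nested models (log-normal versus gamma) fitted to synthetic data whose reference density f0 is known by construction. All data, parameter values, and variable names are illustrative.

# Minimal sketch (assumed setup, not the paper's estimator): plug-in
# KL divergences from a known reference density to two fitted,
# non-nested models.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
y = rng.lognormal(mean=0.5, sigma=0.6, size=2000)  # reference: log-normal data

# Fit two competing, non-nested parametric models to the same data.
ln_shape, _, ln_scale = stats.lognorm.fit(y, floc=0)
ga_shape, _, ga_scale = stats.gamma.fit(y, floc=0)

# Monte Carlo approximation of KL(f0 || fk) = E_f0[log f0(Y) - log fk(Y)],
# averaging over the reference sample y drawn from f0.
log_f0 = stats.lognorm.logpdf(y, s=0.6, scale=np.exp(0.5))
kl_lognorm = np.mean(log_f0 - stats.lognorm.logpdf(y, ln_shape, 0, ln_scale))
kl_gamma = np.mean(log_f0 - stats.gamma.logpdf(y, ga_shape, 0, ga_scale))

print(f"KL(f0 || fitted log-normal) ~ {kl_lognorm:.4f}")
print(f"KL(f0 || fitted gamma)      ~ {kl_gamma:.4f}")

The model with the smaller estimated divergence is closer to the reference; in real applications the reference density is unknown, which is the setting that motivates a divergence-based criterion of this kind.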

Updated: 2019-11-01