Information Divergence and the Generalized Normal Distribution: A Study on Symmetricity
Communications in Mathematics and Statistics (IF 0.9). Pub Date: 2020-02-10. DOI: 10.1007/s40304-019-00200-8
Thomas L. Toulias , Christos P. Kitsos

This paper investigates the use of information divergence, through the widely used Kullback–Leibler (KL) divergence, under the multivariate (generalized) \(\gamma\)-order normal distribution (\(\gamma\)-GND). The behavior of the KL divergence with respect to its symmetricity is studied by calculating the divergence of the \(\gamma\)-GND over the Student's multivariate t-distribution and vice versa. Certain special cases are also given and discussed. Furthermore, three symmetrized forms of the KL divergence, namely the Jeffreys distance and the geometric-KL and harmonic-KL distances, are computed between two members of the \(\gamma\)-GND family, and the corresponding differences between these information distances are also discussed.
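To make the asymmetry under study concrete, here is a minimal Python sketch (not from the paper) that computes the two directed KL divergences between univariate normal distributions, the one-dimensional \(\gamma = 2\) member of the \(\gamma\)-GND family, and combines them by their arithmetic, geometric, and harmonic means. The closed forms for general \(\gamma\) and for the multivariate case are derived in the paper and are not reproduced here; the function names are illustrative, and the Jeffreys distance is taken as the plain sum of the two directed divergences (some authors include a factor of 1/2).

import math

def kl_normal(mu1, s1, mu2, s2):
    """Directed KL divergence KL(N(mu1, s1^2) || N(mu2, s2^2)), in nats."""
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def symmetrized_kl(mu1, s1, mu2, s2):
    """Arithmetic (Jeffreys), geometric, and harmonic symmetrizations of KL."""
    d_pq = kl_normal(mu1, s1, mu2, s2)
    d_qp = kl_normal(mu2, s2, mu1, s1)
    jeffreys = d_pq + d_qp                       # Jeffreys distance (no 1/2 factor here)
    geometric = math.sqrt(d_pq * d_qp)           # geometric-KL distance
    harmonic = 2 * d_pq * d_qp / (d_pq + d_qp)   # harmonic-KL distance
    return jeffreys, geometric, harmonic

# The two directed divergences differ, which is the asymmetry under study:
print(kl_normal(0.0, 1.0, 1.0, 2.0))   # KL(p || q) ~ 0.443
print(kl_normal(1.0, 2.0, 0.0, 1.0))   # KL(q || p) ~ 1.307
print(symmetrized_kl(0.0, 1.0, 1.0, 2.0))

Each of the three combined quantities is, by construction, invariant under swapping the two distributions, which is precisely the symmetry that the directed KL divergence lacks.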



Updated: 2020-02-10