ϕ-Informational Measures: Some Results and Interrelations
Entropy (IF 2.7), Pub Date: 2021-07-18, DOI: 10.3390/e23070911
Steeve Zozor, Jean-François Bercher

In this paper, we focus on extended informational measures based on a convex function ϕ: entropies, extended Fisher information, and generalized moments. Both the generalization of the Fisher information and that of the moments rely on the definition of an escort distribution linked to the (entropic) functional ϕ. We revisit the usual maximum entropy principle, or more precisely its inverse problem, starting from the distribution and the constraints, which leads to the introduction of state-dependent ϕ-entropies. Then, we examine interrelations between the extended informational measures and generalize relationships such as the Cramér–Rao inequality and the de Bruijn identity in this broader context. In this particular framework, the maximum entropy distributions play a central role. Of course, all the results derived in the paper include the usual ones as special cases.
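For orientation, here is a minimal sketch of the standard quantities the abstract refers to, written under a common convention; the paper's exact normalizations and its ϕ-linked escort construction may differ:

\[
  H_\phi(p) \;=\; -\int \phi\bigl(p(x)\bigr)\,\mathrm{d}x , \qquad \phi \ \text{convex},
\]
so that \(\phi(u) = u\log u\) recovers the Shannon entropy \(-\int p\log p\), while \(\phi(u) = (u^{q}-u)/(q-1)\) recovers the Tsallis entropy \(\bigl(1-\int p^{q}\bigr)/(q-1)\). A Tsallis-type escort distribution, of the kind the paper generalizes to a ϕ-dependent version, reads
\[
  E_q[p](x) \;=\; \frac{p(x)^{q}}{\int p(y)^{q}\,\mathrm{d}y}.
\]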

Updated: 2021-07-19