On the Maximum Entropy of a Sum of Independent Discrete Random Variables
Theory of Probability and Its Applications (IF 0.6). Pub Date: 2021-11-02. DOI: 10.1137/s0040585x97t99054x
M. Kovačević

Theory of Probability & Its Applications, Volume 66, Issue 3, Pages 482-487, January 2021.
Let $X_1, \dots, X_n$ be independent random variables taking values in the alphabet $\{0, 1, \dots, r\}$, and let $S_n=\sum_{i=1}^n X_i$. The Shepp--Olkin theorem states that in the binary case (${r=1}$), the Shannon entropy of $S_n$ is maximized when all the $X_i$'s are uniformly distributed, i.e., Bernoulli(1/2). In an attempt to generalize this theorem to arbitrary finite alphabets, we obtain a lower bound on the maximum entropy of $S_n$ and prove that it is tight in several special cases. In addition to these special cases, an argument is presented supporting the conjecture that the bound represents the optimal value for all $n$, $r$, i.e., that $H(S_n)$ is maximized when $X_1, \dots, X_{n-1}$ are uniformly distributed over $\{0, r\}$, while the probability mass function of $X_n$ is a mixture (with explicitly defined nonzero weights) of the uniform distributions over $\{0, r\}$ and $\{1, \dots, r-1\}$.
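The conjectured maximizer can be explored numerically for small $n$ and $r$. The sketch below (an illustration only, not the paper's construction: the paper's explicitly defined mixture weights are not reproduced in the abstract, so the weight `w` is treated here as a free parameter and swept over $[0,1]$) computes $H(S_n)$ by convolving the component pmfs.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a pmf given as a numpy array."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def pmf_of_sum(pmfs):
    """Pmf of a sum of independent variables: convolution of their pmfs."""
    out = np.array([1.0])
    for p in pmfs:
        out = np.convolve(out, p)
    return out

def conjectured_entropy(n, r, w):
    """H(S_n) for the conjectured family (requires r >= 2):
    X_1, ..., X_{n-1} uniform on {0, r}; X_n a mixture of the uniform
    distribution on {0, r} (weight w) and the uniform distribution on
    {1, ..., r-1} (weight 1 - w).  The paper's explicit optimal weight
    is not given in the abstract, so w is left as a free parameter."""
    two_point = np.zeros(r + 1)
    two_point[0] = two_point[r] = 0.5
    interior = np.zeros(r + 1)
    interior[1:r] = 1.0 / (r - 1)
    last = w * two_point + (1.0 - w) * interior
    return entropy(pmf_of_sum([two_point] * (n - 1) + [last]))

# Numerically locate the best mixture weight for one small case:
n, r = 4, 3
weights = np.linspace(0.0, 1.0, 1001)
best_w = max(weights, key=lambda w: conjectured_entropy(n, r, w))
print(f"best weight ~ {best_w:.3f}, "
      f"H(S_n) ~ {conjectured_entropy(n, r, best_w):.4f} bits")
```

With `w = 1` the last variable is also uniform on $\{0, r\}$, so for $n = 2$, $r = 2$ the sum has pmf $(1/4, 0, 1/2, 0, 1/4)$ and entropy exactly $1.5$ bits, a useful sanity check on the convolution.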


Updated: 2021-11-09