An extension of entropy power inequality for dependent random variables
Communications in Statistics - Theory and Methods (IF 0.6), Pub Date: 2020-09-02, DOI: 10.1080/03610926.2020.1813305
Fatemeh Asgari, Mohammad Hossein Alamatsaz

Abstract

The entropy power inequality (EPI) for the convolution of two independent random variables was first established by Shannon (1948). In practice, however, there are many situations in which the random variables involved are not independent. In this article, considering additive noise channels, it is shown that, under certain conditions, the EPI also holds when the random variables involved are dependent. As an intermediate step toward the main result, a lower bound for the Fisher information of the output signal is obtained, which is useful in its own right. An example is also provided to illustrate the result.
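For reference, and as standard background not restated in the abstract itself: writing h(X) for the differential entropy of X, the entropy power of X and Shannon's classical EPI for independent X and Y read

    N(X) = \frac{1}{2\pi e}\, e^{2 h(X)}, \qquad N(X + Y) \ \ge\ N(X) + N(Y).

The paper's specific conditions for the dependent case are not given in this abstract, but a jointly Gaussian sanity check hints at why some condition is needed: for a Gaussian, N(X) = Var(X), so N(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y), and the inequality survives dependence exactly when Cov(X, Y) ≥ 0.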


