Mutual Information and Optimality of Approximate Message-Passing in Random Linear Estimation
IEEE Transactions on Information Theory (IF 2.2). Pub Date: 2020-04-28. DOI: 10.1109/tit.2020.2990880
Jean Barbier, Nicolas Macris, Mohamad Dia, Florent Krzakala

We consider the estimation of a signal from the knowledge of its noisy linear random Gaussian projections. A few examples where this problem is relevant are compressed sensing, sparse superposition codes, and code-division multiple access. There have been a number of works considering the mutual information for this problem using the replica method from statistical physics. Here we put these considerations on a firm rigorous basis. First, we show, using a Guerra-Toninelli-type interpolation, that the replica formula yields an upper bound on the exact mutual information. Second, for many relevant practical cases, we present a converse lower bound via a method that uses spatial coupling, state-evolution analysis, and the I-MMSE theorem. This yields a single-letter formula for the mutual information and the minimum mean-square error of random Gaussian linear estimation for all discrete bounded signals. In addition, we prove that the low-complexity approximate message-passing (AMP) algorithm is optimal outside of the so-called hard phase, in the sense that it asymptotically reaches the minimum mean-square error. In this work, spatial coupling is used primarily as a proof technique. However, our results also establish two important features of spatially coupled noisy linear random Gaussian estimation. First, there is no algorithmically hard phase, meaning that for such systems approximate message-passing always reaches the minimum mean-square error. Second, in the limit of an infinitely long coupled chain, the mutual information associated with spatially coupled systems is the same as that of uncoupled linear random Gaussian estimation.
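To make the AMP algorithm mentioned above concrete, the following is a minimal numerical sketch (not the paper's exact formulation) of an AMP iteration for the standard linear model y = A x + w with an i.i.d. uniform binary (±1) prior, one instance of the "discrete bounded signals" covered by the result. The function name amp_binary, the prior, the noise level, and the problem sizes are illustrative assumptions; the denoiser is the posterior mean tanh(r/τ²) for that prior, and the residual update carries the Onsager correction term characteristic of AMP.

import numpy as np

def amp_binary(y, A, n_iter=30):
    # Minimal AMP sketch for y = A x + noise with x_i drawn uniformly from {-1, +1}.
    # Illustrative assumptions only; not the exact formulation used in the paper.
    m, n = A.shape
    x_hat = np.zeros(n)   # current estimate of the signal
    z = y.copy()          # residual carrying the Onsager memory term
    for _ in range(n_iter):
        tau2 = z @ z / m                       # empirical effective-noise variance
        r = x_hat + A.T @ z                    # pseudo-data, approximately x + N(0, tau2)
        x_hat = np.tanh(r / tau2)              # posterior-mean denoiser for the +/-1 prior
        eta_prime = (1.0 - x_hat ** 2) / tau2  # derivative of the denoiser
        z = y - A @ x_hat + (n / m) * z * eta_prime.mean()  # Onsager correction
    return x_hat

# Toy run: more measurements than unknowns, so AMP sits well outside any hard phase.
rng = np.random.default_rng(0)
n, m = 1000, 1500
x_true = rng.choice([-1.0, 1.0], size=n)
A = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))   # i.i.d. N(0, 1/m) entries
y = A @ x_true + 0.1 * rng.normal(size=m)
print("MSE:", np.mean((amp_binary(y, A) - x_true) ** 2))

In this toy regime the mean-square error of the AMP fixed point should approach the minimum mean-square error predicted by the single-letter formula; inside the hard phase (for harder priors and measurement ratios) AMP would instead stall at a suboptimal fixed point, which is exactly the gap the paper characterizes.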

Last updated: 2020-04-28