Neural Distributed Source Coding
arXiv - CS - Information Theory. Pub Date: 2021-06-05. arXiv: 2106.02797
Jay Whang, Anish Acharya, Hyeji Kim, Alexandros G. Dimakis

Distributed source coding is the task of encoding an input when correlated side information is available only to the decoder. Remarkably, Slepian and Wolf showed in 1973 that an encoder with no access to this side information can asymptotically achieve the same compression rate as when the side information is available at both the encoder and the decoder. While there is significant prior work on this topic in information theory, practical distributed source coding has been limited to synthetic datasets and specific correlation structures. Here we present a general framework for lossy distributed source coding that is agnostic to the correlation structure and scales to high dimensions. Rather than relying on hand-crafted source modeling, our method utilizes a powerful conditional deep generative model to learn the distributed encoder and decoder. We evaluate our method on realistic high-dimensional datasets and show substantial improvements in distributed compression performance.
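For reference, the Slepian-Wolf limit mentioned above can be stated concisely (a standard information-theoretic fact, not specific to this paper): for a memoryless source X with side information Y available only at the decoder, lossless compression is achievable at any rate

    R \geq H(X \mid Y),

which is the same conditional-entropy bound that applies when Y is visible to both encoder and decoder, so the encoder asymptotically loses nothing by not observing Y.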
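The abstract gives no implementation details; the following is only an illustrative sketch of the distributed-coding setup it describes, written as a hypothetical PyTorch encoder/decoder pair. All module names, layer sizes, and the toy correlation model below are assumptions for illustration, not the authors' architecture.

import torch
import torch.nn as nn

class DistributedEncoder(nn.Module):
    """Encodes x alone; by construction it never sees the side information y."""
    def __init__(self, x_dim: int, code_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim, 256), nn.ReLU(), nn.Linear(256, code_dim)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.net(x)
        # Hard quantization for a discrete code; torch.round blocks gradients,
        # so real training would use a straight-through estimator or a
        # noise-based relaxation instead.
        return torch.round(z)

class ConditionalDecoder(nn.Module):
    """Reconstructs x from the code z plus the decoder-only side information y."""
    def __init__(self, code_dim: int, y_dim: int, x_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(code_dim + y_dim, 256), nn.ReLU(), nn.Linear(256, x_dim)
        )

    def forward(self, z: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([z, y], dim=-1))

# Toy usage: y is a noisy, correlated view of x (assumed correlation model).
x = torch.randn(8, 32)
y = x + 0.1 * torch.randn(8, 32)
enc = DistributedEncoder(x_dim=32, code_dim=4)
dec = ConditionalDecoder(code_dim=4, y_dim=32, x_dim=32)
x_hat = dec(enc(x), y)
loss = nn.functional.mse_loss(x_hat, x)  # distortion term of a rate-distortion objective

The structural constraint that matches the setting above is that only the decoder receives y; training such a pair end-to-end with an additional rate penalty on z would yield a learned lossy distributed codec.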

Last updated: 2021-06-08