Neural Distributed Compressor Does Binning

Published: 11 Jul 2023 · Last Modified: 11 Jul 2023 · NCW ICML 2023
Keywords: Distributed source coding, machine learning, Wyner-Ziv coding
TL;DR: We demonstrate that a neural distributed compressor mimics the Wyner-Ziv theorem from network information theory, even though no particular structure was imposed on the model.
Abstract: We consider lossy compression of an information source when the decoder has lossless access to a correlated source. This setup, also known as the _Wyner-Ziv_ problem in information theory, is a special case of distributed source coding. To this day, real-world applications of this problem have not been fully developed or widely investigated. We find that our neural network-based compression scheme rediscovers some principles of the optimal theoretical solution of the Wyner-Ziv setup, such as _binning_ in the source space as well as linear decoder behavior within each quantization index, for the quadratic-Gaussian case. Binning is a widely used tool in information-theoretic proofs and methods, and to our knowledge, this is the first time it has been explicitly observed to emerge from data-driven learning.
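To make the binning idea concrete, the following is a minimal hand-crafted sketch of Wyner-Ziv-style binning in the quadratic-Gaussian case (not the paper's learned model): the encoder finely quantizes X but transmits only a coset (bin) index, and the decoder resolves the ambiguity using the correlated side information Y. All parameters (step size, bin count, noise level) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy quadratic-Gaussian Wyner-Ziv setup: decoder observes Y = X + N losslessly.
n = 10_000
x = rng.normal(0.0, 1.0, n)          # source X ~ N(0, 1)
y = x + rng.normal(0.0, 0.1, n)      # side information Y, highly correlated with X

step = 0.25                          # fine quantizer step (assumed)
num_bins = 4                         # coset count: only log2(num_bins) = 2 bits sent

q = np.round(x / step).astype(int)   # fine quantization index (not transmitted)
bin_idx = q % num_bins               # encoder transmits only the bin (coset) index

# Decoder: among all fine indices sharing the received bin index,
# pick the one whose reconstruction lies closest to the side information Y.
k = np.round((y / step - bin_idx) / num_bins)
q_hat = (k * num_bins + bin_idx).astype(int)
x_hat = q_hat * step

mse = np.mean((x - x_hat) ** 2)      # far below Var(X) = 1 despite sending 2 bits
```

Because Y pins down which coset member the encoder used, the same bin index can be safely shared by many distant quantization cells; this reuse of indices across the source space is exactly the binning structure the paper observes emerging in the learned compressor.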
Submission Number: 6