On Distributed Lossy Coding of Symmetrically Correlated Gaussian Sources

CISS 2022 (modified: 17 Nov 2022)
Abstract: In this paper, we consider a distributed lossy compression network with $L$ encoders and a decoder. Each encoder observes a source and sends a compressed version to the decoder, which produces a joint reconstruction of the target signals with mean squared error distortion below a given threshold. It is assumed that each observed source is the sum of a target signal and a corruptive noise, where the signals and noises are generated independently from two symmetric multivariate Gaussian distributions. We are interested in the minimum compression rate of this network as a function of the distortion threshold, known as the rate-distortion function. We derive a lower bound on the rate-distortion function by explicitly solving a max-min problem. Our lower bound matches the well-known Berger-Tung upper bound for some values of the distortion threshold. The asymptotic expressions of the upper and lower bounds are derived in the large-$L$ limit and are shown to coincide under specific constraints.
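The source model in the abstract can be illustrated with a small numerical sketch. The code below generates $L$ observations of the form $X_\ell = S_\ell + N_\ell$, where the target signals and noises are drawn from symmetric multivariate Gaussian distributions (equal variances on the diagonal, equal correlations off the diagonal). The specific variance and correlation values are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def symmetric_cov(L, var, corr):
    """Covariance of a symmetric multivariate Gaussian: identical
    variances on the diagonal and identical pairwise correlations
    off the diagonal (positive definite for corr in [0, 1))."""
    return var * ((1.0 - corr) * np.eye(L) + corr * np.ones((L, L)))

rng = np.random.default_rng(0)
L, n = 5, 100_000  # illustrative number of encoders and samples

# Target signals S and corruptive noises N are generated independently
# from two symmetric Gaussian distributions; each encoder observes
# X_l = S_l + N_l. The variances/correlations below are assumptions.
cov_S = symmetric_cov(L, var=1.0, corr=0.6)
cov_N = symmetric_cov(L, var=0.2, corr=0.1)

S = rng.multivariate_normal(np.zeros(L), cov_S, size=n)
N = rng.multivariate_normal(np.zeros(L), cov_N, size=n)
X = S + N  # observations available at the L encoders

# Since S and N are independent, Cov(X) = cov_S + cov_N, which the
# empirical covariance of the samples should approximate.
emp_cov = np.cov(X, rowvar=False)
```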