Lower Bounding Rate-Distortion From Samples

Mar 04, 2021 (edited Aug 05, 2021), Neural Compression Workshop @ ICLR 2021
  • Keywords: rate-distortion, lossy compression, stochastic optimization, information theory, deep generative modeling
  • TL;DR: We propose a stochastic optimization algorithm for computing a lower bound of the rate-distortion function of a general data source, only requiring its samples; we give asymptotic guarantees on the correctness of the bound.
  • Abstract: The rate-distortion function tells us the minimal number of bits needed, on average, to compress a random object within a given distortion tolerance. A lower bound on the rate-distortion function therefore represents a fundamental limit on the best possible rate-distortion performance of any lossy compression algorithm, and can help us assess the potential room for improvement. We make a first attempt at an algorithm for computing such a lower bound, applicable to general data sources from which we only have samples. Based on a dual characterization of the rate-distortion function as a constrained maximization problem, our method approximates the exact constraint function by an asymptotically unbiased estimator, allowing for stochastic optimization. On a 2D Gaussian source, we obtain a lower bound within 1 bit of the analytical rate-distortion function.
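To give a concrete feel for this kind of sample-based dual bound (this is an illustrative sketch, not the paper's algorithm), the following NumPy snippet works a simpler case: a 1-D standard Gaussian source under squared-error distortion. It uses the classical dual characterization R(D) = sup over s ≥ 0 and λ(·) > 0 of { E[log λ(X)] − sD }, subject to sup_y E[λ(X) exp(−s d(X, y))] ≤ 1, restricts log λ(x) to a quadratic family a·x² + b, estimates the constraint from samples over a finite grid of reproduction points y, and replaces stochastic optimization with a simple grid search over a. The sample size, grids, and parametrization are all assumptions made for the example.

```python
import numpy as np

# Illustrative setup (not from the paper): 1-D standard Gaussian source,
# squared-error distortion, distortion tolerance D.
rng = np.random.default_rng(0)
n = 20000
x = rng.standard_normal(n)          # samples from the source
D = 0.25                            # distortion tolerance
s = 1.0 / (2.0 * D)                 # dual slope; the optimal value in this Gaussian case
ys = np.linspace(-6.0, 6.0, 241)    # finite grid standing in for the sup over reproductions y

def dual_bound(a):
    """Sample-based dual lower bound (in nats) for log lambda(x) = a*x^2 + b,
    with b chosen so the estimated constraint sup_y E[lambda(X) e^{-s d(X,y)}] equals 1."""
    log_inner = a * x[:, None] ** 2 - s * (x[:, None] - ys[None, :]) ** 2
    m = log_inner.max(axis=0)
    # log of the sample mean of exp(log_inner), computed stably for each grid point y
    log_constraint = m + np.log(np.exp(log_inner - m).sum(axis=0) / n)
    b = -log_constraint.max()       # tightest intercept that keeps (s, lambda) feasible on the grid
    return a * np.mean(x ** 2) + b - s * D

# Crude stand-in for the paper's stochastic optimization: grid search over a.
bound_bits = max(dual_bound(a) for a in np.linspace(0.0, 0.5, 26)) / np.log(2.0)
R_true_bits = 0.5 * np.log2(1.0 / D)    # analytic R(D) of a unit-variance Gaussian, in bits
print(f"estimated lower bound: {bound_bits:.3f} bits; analytic R(D): {R_true_bits:.3f} bits")
```

Note that the sample-and-grid estimate of the constraint is only asymptotically exact, which is why the abstract's correctness guarantee is likewise asymptotic: with finitely many samples the estimated sup can deviate from the true one, so the resulting value is a lower bound only up to estimation error.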