Improving Lossless Compression Rates via Monte Carlo Bits-Back Coding

Mar 04, 2021 (edited Apr 01, 2021) · Neural Compression Workshop @ ICLR 2021
  • Keywords: Monte Carlo method, variational inference, neural compression, bits-back coding
  • TL;DR: Novel bits-back coding schemes derived from tighter variational bounds, yielding improved lossless compression rates
  • Abstract: Latent variable models have been successfully applied in lossless compression with the bits-back coding algorithm. However, bits-back suffers from an increase in the bitrate equal to the KL divergence between the approximate posterior and the true posterior. In this paper, we show how to remove this gap asymptotically by deriving bits-back schemes from tighter variational bounds. The key idea is to exploit extended space representations of Monte Carlo estimators of the marginal likelihood. Naively applied, our schemes would require more initial bits than the standard bits-back coder, but we show how to drastically reduce this additional cost with couplings in the latent space. We demonstrate improved lossless compression rates in a variety of settings.
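For readers less familiar with the rate argument the abstract relies on, here is a minimal sketch in standard variational notation (an illustration, not drawn verbatim from the paper): the expected net bits-back codelength is the negative ELBO, which exceeds the ideal rate $-\log p(x)$ by exactly the KL divergence between the approximate and true posteriors; an importance-weighted, $K$-sample Monte Carlo bound of the kind the abstract alludes to closes this gap as $K$ grows.

```latex
% Minimal sketch (standard variational notation; the K-sample bound is the
% usual IWAE-style estimator, shown only to illustrate the abstract's claim,
% not as the paper's exact construction).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{align}
  \underbrace{\mathbb{E}_{q(z \mid x)}\!\left[\log \frac{q(z \mid x)}{p(x, z)}\right]}_{\text{expected net bits-back codelength (negative ELBO)}}
    &= -\log p(x) + \mathrm{KL}\!\left(q(z \mid x)\,\|\,p(z \mid x)\right), \\
  \log p(x)
    &\ge \mathbb{E}_{z_{1:K} \sim q(z \mid x)}\!\left[\log \frac{1}{K} \sum_{k=1}^{K} \frac{p(x, z_k)}{q(z_k \mid x)}\right]
    \xrightarrow[\;K \to \infty\;]{} \log p(x).
\end{align}
\end{document}
```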