Improving Lossless Compression Rates via Monte Carlo Bits-Back Coding

Published: 01 Apr 2021 (Last Modified: 22 Oct 2023), Neural Compression Workshop @ ICLR 2021
Keywords: Monte Carlo method, variational inference, neural compression, bits-back coding
TL;DR: Novel bits-back coding schemes derived from tighter variational bounds with improved lossless compression rates
Abstract: Latent variable models have been successfully applied in lossless compression with the bits-back coding algorithm. However, bits-back suffers from an increase in the bitrate equal to the KL divergence between the approximate posterior and the true posterior. In this paper, we show how to remove this gap asymptotically by deriving bits-back schemes from tighter variational bounds. The key idea is to exploit extended space representations of Monte Carlo estimators of the marginal likelihood. Naively applied, our schemes would require more initial bits than the standard bits-back coder, but we show how to drastically reduce this additional cost with couplings in the latent space. We demonstrate improved lossless compression rates in a variety of settings.
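For readers unfamiliar with the rate gap the abstract refers to, here is a brief sketch in standard variational-inference notation (not drawn from the paper itself): standard bits-back coding achieves a net rate equal to the negative evidence lower bound, which exceeds the ideal rate $-\log p(x)$ by exactly the KL divergence mentioned above, while multi-sample Monte Carlo bounds such as the importance-weighted (IWAE) objective tighten toward $\log p(x)$ as the number of samples $K$ grows, which is the sense in which the gap can be removed asymptotically.

```latex
% Sketch in standard notation (not copied from the paper):
% net bits-back rate vs. the ideal rate -log p(x), and a multi-sample bound.
\begin{align}
  \mathrm{Rate}_{\mathrm{BB}}(x)
    &= -\mathbb{E}_{q(z \mid x)}\!\left[\log \frac{p(x, z)}{q(z \mid x)}\right]
     = -\log p(x) + \mathrm{KL}\!\left(q(z \mid x)\,\big\|\,p(z \mid x)\right), \\
  \mathcal{L}_K(x)
    &= \mathbb{E}_{z_1,\dots,z_K \sim q(z \mid x)}\!\left[\log \frac{1}{K}
       \sum_{k=1}^{K} \frac{p(x, z_k)}{q(z_k \mid x)}\right]
     \;\xrightarrow[K \to \infty]{}\; \log p(x).
\end{align}
```

Coding against a multi-sample bound of this form involves additional latent samples, which is the extra initial-bit cost the abstract mentions and the reason the paper introduces couplings in the latent space to reduce it.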
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2102.11086/code)