Lossless Compression using Continuously-Indexed Normalizing Flows

Published: 01 Apr 2021, Last Modified: 05 May 2023 · Neural Compression Workshop @ ICLR 2021
Keywords: lossless, bits-back, compression
TL;DR: lossless compression using Continuously-Indexed Normalizing Flows
Abstract: Recently, a class of deep generative models known as continuously-indexed flows (CIFs) has expanded the modelling capacity of normalizing flows (NFs) in the context of both density estimation and variational inference. CIFs are provably more general and expressive than NFs, but do not induce a closed-form density model and thus require additional considerations when applied in the same contexts in which NFs have shown promise. One such area is lossless compression, where NFs have been used as the density model to develop a compression scheme, known as local bits-back, with expected codelength approximately equal to the average negative log-likelihood of the NF density model. Here, we propose to extend the local bits-back scheme to CIF-based density models, as the improved expressiveness inherent in CIFs stands to reduce the expected codelength of compressed data. We also leverage recent work on compression schemes built with hierarchical variational auto-encoders -- as hierarchical CIFs can themselves be seen as interpolating between these and NFs -- thereby gaining further expressiveness in our density models and effectiveness in our compression scheme.
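To make the mechanism concrete, below is a minimal PyTorch sketch (not the authors' implementation) of a single CIF layer, showing how the per-layer negative-ELBO terms would be computed; under a bits-back scheme, the expected codelength in nats is approximately this bound. The module names (`CIFLayer`, `prior_net`, `post_net`, etc.) and the simple affine bijection with diagonal-Gaussian index distributions are illustrative assumptions, not the paper's architecture.

```python
import math
import torch
import torch.nn as nn


def gaussian_log_prob(x, mu, log_sigma):
    """Diagonal-Gaussian log density, summed over the last dimension."""
    return (-0.5 * ((x - mu) / log_sigma.exp()) ** 2
            - log_sigma - 0.5 * math.log(2 * math.pi)).sum(dim=-1)


class CIFLayer(nn.Module):
    """One CIF step: x = f(z; u) with a continuous index u ~ p(u | z)."""

    def __init__(self, dim, index_dim):
        super().__init__()
        self.prior_net = nn.Linear(dim, 2 * index_dim)   # parameterises p(u | z)
        self.post_net = nn.Linear(dim, 2 * index_dim)    # parameterises q(u | x)
        self.scale_net = nn.Linear(index_dim, dim)       # s(u) for the bijection
        self.shift_net = nn.Linear(index_dim, dim)       # t(u) for the bijection

    def neg_elbo_terms(self, x):
        """Per-example -ELBO contribution of this layer, plus the inverted z."""
        # Sample u ~ q(u | x) via the reparameterisation trick.
        mu_q, log_sig_q = self.post_net(x).chunk(2, dim=-1)
        u = mu_q + log_sig_q.exp() * torch.randn_like(mu_q)
        log_q = gaussian_log_prob(u, mu_q, log_sig_q)

        # Invert the index-conditioned affine bijection: z = (x - t(u)) * exp(-s(u)).
        s, t = self.scale_net(u), self.shift_net(u)
        z = (x - t) * torch.exp(-s)
        log_det = -s.sum(dim=-1)                         # log |det dz/dx|

        # Score the index under the conditional prior p(u | z).
        mu_p, log_sig_p = self.prior_net(z).chunk(2, dim=-1)
        log_p_u = gaussian_log_prob(u, mu_p, log_sig_p)

        # Adding -log p(z) under the base density (and any further layers)
        # gives the full bound, and hence the approximate expected codelength.
        return log_q - log_p_u - log_det, z


# Example: codelength bound (in nats) for a batch of 8 toy inputs of dimension 16.
layer = CIFLayer(dim=16, index_dim=4)
x = torch.randn(8, 16)
neg_elbo, z = layer.neg_elbo_terms(x)
base_nll = 0.5 * (z ** 2 + math.log(2 * math.pi)).sum(dim=-1)  # standard-normal base
print((neg_elbo + base_nll).mean())  # average bound ≈ expected bits-back codelength
```

Because a CIF has no closed-form density, the quantity being coded against is this variational bound rather than an exact log-likelihood, which is why the bits-back machinery (encoding/decoding the index u with q(u | x)) is needed on top of the usual NF-based local bits-back scheme.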