Keywords: diffusion, generative modeling, data compression
TL;DR: We take a step towards practical compression with an unconditional diffusion model by simulating uniform-noise channels instead of Gaussian channels.
Abstract: Diffusion probabilistic models have achieved success in many generative modeling tasks, from image generation to inverse problem solving.
A distinctive feature of these models is that they correspond to deep hierarchical latent-variable models trained by maximizing a variational evidence lower bound (ELBO) on the data log-likelihood.
Drawing on a basic connection between likelihood modeling and compression, we explore the potential of diffusion models for progressive coding, resulting in a sequence of bits that can be incrementally transmitted and decoded with progressively improving reconstruction quality.
Unlike prior work based on Gaussian diffusion or conditional diffusion models, we propose a new form of diffusion model with uniform noise in the forward process, whose negative ELBO corresponds to the end-to-end compression cost using universal quantization.
We obtain promising first results on image compression, achieving competitive rate-distortion and rate-realism performance across a wide range of bit rates with a single model.
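For context, the likelihood-compression connection drawn above rests on the standard per-step decomposition of the diffusion NELBO (notation as in Ho et al., 2020, with data $x_0$ and latents $x_1, \dots, x_T$); each KL term can be read as the bit cost of transmitting the corresponding latent step with a relative-entropy-coding scheme. This is a sketch of the standard decomposition, and the paper's exact formulation may differ:

$$
-\mathrm{ELBO} \;=\; \mathbb{E}_q\!\Big[\, D_{\mathrm{KL}}\big(q(x_T \mid x_0)\,\|\,p(x_T)\big) \;+\; \sum_{t=2}^{T} D_{\mathrm{KL}}\big(q(x_{t-1} \mid x_t, x_0)\,\|\,p_\theta(x_{t-1} \mid x_t)\big) \;-\; \log p_\theta(x_0 \mid x_1) \,\Big]
$$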
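To make "universal quantization" concrete, below is a minimal NumPy sketch of the classical construction (Ziv, 1985): quantizing with a dither shared between encoder and decoder yields a reconstruction error that is exactly uniform and independent of the input, i.e., it simulates a uniform-noise channel. The function name, step size `delta`, and shared-seed mechanism here are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def universal_quantize(x, delta, rng):
    """Dithered (universal) quantization with step size `delta`.

    The dither `u` must be shared by encoder and decoder, e.g. derived
    from a common pseudo-random seed. The integer symbols `q` are what
    would be entropy-coded and transmitted.
    """
    u = rng.uniform(-0.5, 0.5, size=x.shape)  # shared dither
    q = np.round(x / delta - u)               # symbols to entropy-code
    x_hat = (q + u) * delta                   # decoder's reconstruction
    return q, x_hat

rng = np.random.default_rng(42)  # stands in for a seed shared by both sides
x = rng.normal(size=8)
q, x_hat = universal_quantize(x, delta=0.25, rng=rng)
# The error x_hat - x is Uniform(-delta/2, delta/2), independent of x:
assert np.all(np.abs(x_hat - x) <= 0.25 / 2 + 1e-12)
```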
Submission Number: 97