Optimizing Few-Step Diffusion Samplers by Gradient Descent

Anonymous

Sep 29, 2021 (edited Oct 06, 2021) · ICLR 2022 Conference Blind Submission
  • Abstract: Denoising Diffusion Probabilistic Models (DDPMs) have emerged as a flexible family of generative models rivaling GANs and autoregressive models in sample quality and likelihood. However, DDPMs typically require hundreds of inference steps to generate a high-fidelity image, despite recent progress on speeding up diffusion model sampling. We introduce Differentiable Diffusion Sampler Search (DDSS): a method that learns few-step samplers for any pre-trained DDPM by gradient descent. We propose Generalized Gaussian Diffusion Processes (GGDP), a family of non-Markovian samplers for diffusion models, and we show how to improve the samples generated by pre-trained DDPMs by optimizing the degrees of freedom of the GGDP sampler family with respect to a perceptual loss. Our optimization procedure backpropagates through the sampling process using the reparameterization trick. Searching our novel GGDP family with DDSS, we achieve strong results on unconditional image generation on both CIFAR10 and ImageNet 64x64 (e.g., FID 7.59 on CIFAR10 with only 10 inference steps, and 4.67 with 25 steps, compared to 13.62 and 6.56 for the strongest respective DDIM($\eta=0$) baselines). Our method is compatible with any pre-trained DDPM without re-training: it only needs to be applied once per model and does not fine-tune the parameters of the pre-trained DDPM.
  • One-sentence Summary: We propose a method to discover fast, high-fidelity samplers for diffusion probabilistic models.
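The core idea, backpropagating through a few-step Gaussian sampler via the reparameterization trick to tune the sampler's free coefficients, can be sketched in a few lines. This is a minimal toy illustration, not the paper's GGDP parameterization or perceptual loss: `eps_model` is a placeholder for a frozen pre-trained denoiser, the per-step coefficients `a` and `s` stand in for the GGDP degrees of freedom, and the loss simply matches a target sample spread rather than a perceptual distance.

```python
import torch

torch.manual_seed(0)

# Placeholder for a frozen pre-trained DDPM noise-prediction network
# (hypothetical; the real model would be a large U-Net kept fixed).
def eps_model(x, t):
    return x * t  # toy noise prediction, NOT a real DDPM

# Learnable sampler degrees of freedom (stand-ins for GGDP free
# parameters): per-step mean coefficients and noise scales.
steps = 3
a = torch.full((steps,), 0.9, requires_grad=True)
s = torch.full((steps,), 0.1, requires_grad=True)

def sample(batch):
    x = torch.randn(batch, 2)  # start from pure Gaussian noise
    for i in range(steps):
        eps = eps_model(x, 1.0 - i / steps)
        # Reparameterized Gaussian transition: the randomness z is an
        # exogenous input, so gradients flow through a[i] and s[i].
        z = torch.randn_like(x)
        x = a[i] * (x - s[i] * eps) + s[i] * z
    return x

# Surrogate quality loss (the paper uses a perceptual loss on images):
# push the per-coordinate std of the samples toward a target of 0.5.
opt = torch.optim.Adam([a, s], lr=1e-2)
for _ in range(200):
    loss = ((sample(256).std(0) - 0.5) ** 2).sum()
    opt.zero_grad()
    loss.backward()   # backprop through the entire sampling chain
    opt.step()        # update only the sampler, never the DDPM
```

Note that only `a` and `s` are optimized; the denoiser itself receives no gradient updates, mirroring the paper's claim that the pre-trained DDPM is left untouched.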