Generative Modeling by Estimating Gradients of the Data Distribution

Yang Song, Stefano Ermon

Sep 06, 2019 (edited Nov 05, 2019) · NeurIPS 2019
  • Abstract: We explore a new class of generative models based on estimating the vector field of gradients of the data distribution using score matching, and employing Langevin dynamics to generate samples. To enable its application to real-world data lying on low-dimensional manifolds, we corrupt the data with different levels of random Gaussian noise and jointly estimate the gradients corresponding to all noise levels. For sampling, we propose an annealed Langevin dynamics approach where an annealing schedule of noise levels is used to guide sampling from far from to close to the data manifold. Our framework allows flexible model architectures, requires no sampling during training or adversarial methods, and has a unified objective that can be used for model comparison. Our models produce samples with fidelity comparable to GANs on the MNIST, CelebA and CIFAR-10 datasets. In particular, our model achieves the state-of-the-art inception score of 8.91 and a competitive FID score of 25.32 on CIFAR-10. Additionally, we demonstrate that our models learn effective representations via image inpainting experiments.
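The annealed Langevin dynamics sampler described in the abstract can be sketched as follows. This is a minimal, hedged illustration, not the authors' released code: `score_fn` is a hypothetical stand-in for a trained noise-conditional score network, and the per-level step size `alpha = eps * (sigma / sigmas[-1]) ** 2` follows the schedule described in the paper, run here with NumPy instead of a deep-learning framework.

```python
import numpy as np

def annealed_langevin_dynamics(score_fn, x_init, sigmas, n_steps=100,
                               eps=2e-5, rng=None):
    """Sample by running Langevin dynamics at a decreasing sequence of
    noise levels, warm-starting each level from the previous one.

    score_fn : callable (x, sigma) -> estimated score of the
               sigma-smoothed data density (hypothetical trained model)
    x_init   : starting point, e.g. uniform or Gaussian noise
    sigmas   : decreasing noise levels sigma_1 > ... > sigma_L
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x_init, dtype=float)
    sigma_L = sigmas[-1]
    for sigma in sigmas:
        # Step size scaled so the signal-to-noise ratio of each Langevin
        # update stays roughly constant across noise levels.
        alpha = eps * (sigma / sigma_L) ** 2
        for _ in range(n_steps):
            z = rng.standard_normal(x.shape)
            # Langevin update: gradient ascent on log-density plus noise.
            x = x + 0.5 * alpha * score_fn(x, sigma) + np.sqrt(alpha) * z
    return x
```

For example, with the analytically known score of a standard Gaussian, `score_fn = lambda x, sigma: -x`, the sampler drifts toward high-density regions while the injected noise keeps it exploring; with a learned score network, the same loop generates images from pure noise.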