Reproducibility Challenge – Generative Modeling by Estimating Gradients of the Data Distribution

Published: 20 Feb 2020, Last Modified: 05 May 2023
NeurIPS 2019 Reproducibility Challenge Blind Report
Abstract: In this project we attempt to reproduce the results of the paper Generative Modeling by Estimating Gradients of the Data Distribution by Song & Ermon (2019). The authors propose a novel generative framework based solely on gradients of the data density estimated by a neural network. Once the model is trained, sampling can be performed with annealed Langevin dynamics. While we managed to reproduce the experiments qualitatively, we were unable to match the reported Inception and FID scores on CIFAR-10. We further extended the original work in several directions: computing FID and Inception scores for CelebA as well, investigating the sampling hyperparameters ε and T, using a linear instead of a geometric annealing schedule for the noise levels, and trying a different network architecture.
Track: Replicability
NeurIPS Paper Id: /forum?id=B1lcYrBgLH
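
The abstract notes that, once the score network is trained, samples are drawn with annealed Langevin dynamics over a sequence of noise levels, controlled by the step-size parameter ε and the number of steps T per level. Below is a minimal sketch of such a sampler, assuming a score_fn(x, sigma) callable stands in for the trained network; the function names and the toy Gaussian score at the end are illustrative assumptions, not code from the report.

import numpy as np

def annealed_langevin_sampling(score_fn, x_init, sigmas, eps=2e-5, T=100, rng=None):
    """Annealed Langevin dynamics in the style of Song & Ermon (2019).

    score_fn(x, sigma) is assumed to return an estimate of the score
    (gradient of the log density) of the data perturbed with noise level sigma.
    eps and T are the sampling hyperparameters investigated in the report.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x_init, dtype=float).copy()
    sigma_L = sigmas[-1]  # smallest noise level in the schedule
    for sigma in sigmas:  # anneal from the largest to the smallest noise level
        alpha = eps * (sigma / sigma_L) ** 2  # per-level step size
        for _ in range(T):
            z = rng.standard_normal(x.shape)
            x = x + 0.5 * alpha * score_fn(x, sigma) + np.sqrt(alpha) * z
    return x

# Toy usage: the exact score of N(0, (1 + sigma^2) I) stands in for a trained network.
toy_score = lambda x, sigma: -x / (1.0 + sigma ** 2)
sigmas = np.geomspace(1.0, 0.01, num=10)  # geometric annealing schedule for noise levels
sample = annealed_langevin_sampling(toy_score, x_init=np.zeros(2), sigmas=sigmas)

Replacing np.geomspace with np.linspace in the sketch corresponds to the linear annealing schedule mentioned in the abstract as one of our extensions.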