A Study on Sample Diversity in Generative Models: GANs vs. Diffusion Models

01 Mar 2023 (modified: 29 May 2023). Submitted to Tiny Papers @ ICLR 2023.
Keywords: Generative Models, GAN, DDPM, Sample Diversity
TL;DR: We compare GANs and DDPMs, showing that GANs produce high-quality samples but suffer from mode collapse, while DDPMs mitigate this problem and demonstrate improved sample diversity.
Abstract: In this project, we compare the sample diversity of two families of generative models: Generative Adversarial Networks (GANs) and Denoising Diffusion Probabilistic Models (DDPMs). GANs have achieved impressive results in generating high-quality samples, but are known to suffer from mode collapse, which can result in a lack of sample diversity. Mode collapse occurs when the generator network in a GAN becomes stuck in a local minimum, causing it to produce samples that are similar to each other rather than drawn from the full range of possibilities in the target distribution; the generator is then unable to explore and represent the full range of features in the data. DDPMs, on the other hand, have demonstrated improved sample diversity compared to GANs. We conducted experiments using both synthetic and image data to explore the connection between mode collapse and sample diversity in these two frameworks. Our findings indicate that by addressing the mode collapse problem, DDPMs preserve a comprehensive representation of the target distribution.
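On synthetic data, the link between mode collapse and diversity is often quantified by checking how many modes of a known mixture the generated samples actually hit. The sketch below is an illustrative assumption, not the paper's actual metric: the `mode_coverage` helper and its distance threshold are hypothetical, and a "collapsed" generator is simulated by sampling near a single mode.

```python
import numpy as np

def mode_coverage(samples, modes, threshold=0.3):
    """Count how many target modes are covered by at least one sample.

    A mode counts as covered if some sample lies within `threshold`
    (Euclidean distance) of it. Low coverage on a known mixture is a
    simple symptom of mode collapse. (Illustrative metric, not the
    one used in the paper.)
    """
    samples = np.asarray(samples, dtype=float)
    modes = np.asarray(modes, dtype=float)
    # Pairwise distances, shape (n_samples, n_modes)
    dists = np.linalg.norm(samples[:, None, :] - modes[None, :, :], axis=-1)
    covered = dists.min(axis=0) < threshold
    return int(covered.sum())

# Toy target: a 4-mode Gaussian mixture on the corners of a square.
modes = np.array([[0.0, 0.0], [0.0, 4.0], [4.0, 0.0], [4.0, 4.0]])
rng = np.random.default_rng(0)

# "Collapsed" generator: all samples cluster around one mode.
collapsed = modes[0] + 0.1 * rng.standard_normal((200, 2))
# "Diverse" generator: samples spread across all modes.
diverse = modes[rng.integers(0, 4, 200)] + 0.1 * rng.standard_normal((200, 2))

print(mode_coverage(collapsed, modes))  # 1 of 4 modes covered
print(mode_coverage(diverse, modes))    # 4 of 4 modes covered
```

With the noise scale (0.1) far below the inter-mode distance (4.0), the collapsed sampler covers only its single mode while the diverse sampler covers all four, which is the pattern a mode-coverage experiment is designed to expose.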