Multimodal Noise and Covering Initializations for GANs

03 May 2025 (modified: 19 Feb 2017) · ICLR 2017 · Readers: Everyone
Abstract: This note describes two simple techniques to stabilize the training of Generative Adversarial Networks (GANs) on multimodal data. First, we propose a covering initialization for the generator. This initialization pre-trains the generator to match the empirical mean and covariance of its samples to those of the real training data. Second, we propose using multimodal input noise distributions. Our experiments reveal that the joint use of these two techniques stabilizes GAN training and produces generators with a richer diversity of samples. Our code is available at http://pastebin.com/GmHxL0e8.
TL;DR: Using multimodal noise distributions and initializing the generator to cover all the data distribution stabilizes GAN training.
Conflicts: fb.com, inria.fr
Keywords: Deep learning, Unsupervised Learning, Optimization
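
The two techniques described in the abstract are concrete enough to sketch in code. Below is a minimal, hypothetical PyTorch sketch (it is not the released code linked above): multimodal_noise draws the generator's input noise from a fixed mixture of Gaussians rather than a single Gaussian, and covering_pretrain pre-trains the generator so that the empirical mean and covariance of its samples match those of the real training data. The generator architecture, mode count, and hyperparameters are illustrative assumptions, not the paper's settings.

    # Illustrative sketch only; dimensions and hyperparameters are assumptions.
    import torch
    import torch.nn as nn

    noise_dim, data_dim, num_modes = 16, 2, 8

    # Multimodal input noise: a fixed mixture of Gaussians.
    mode_centers = torch.randn(num_modes, noise_dim)

    def multimodal_noise(batch_size, scale=0.3):
        # Pick a mode per sample, then add Gaussian noise around its center.
        idx = torch.randint(num_modes, (batch_size,))
        return mode_centers[idx] + scale * torch.randn(batch_size, noise_dim)

    # Placeholder generator architecture (assumed, not the paper's).
    generator = nn.Sequential(
        nn.Linear(noise_dim, 64), nn.ReLU(),
        nn.Linear(64, data_dim),
    )

    def covering_pretrain(generator, real_data, steps=500, lr=1e-3):
        # Pre-train the generator to match the data's mean and covariance.
        real_mean = real_data.mean(dim=0)
        real_cov = torch.cov(real_data.T)
        opt = torch.optim.Adam(generator.parameters(), lr=lr)
        for _ in range(steps):
            fake = generator(multimodal_noise(real_data.size(0)))
            loss = ((fake.mean(dim=0) - real_mean) ** 2).sum() \
                 + ((torch.cov(fake.T) - real_cov) ** 2).sum()
            opt.zero_grad()
            loss.backward()
            opt.step()

    # Example usage: pre-train on a stand-in dataset before adversarial training.
    real_data = torch.randn(1024, data_dim)  # placeholder for real training samples
    covering_pretrain(generator, real_data)

After this covering pre-training, standard adversarial training would proceed as usual, with the generator already producing samples whose first and second moments roughly cover the data distribution.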
