Multimodal Noise and Covering Initializations for GANs

26 Apr 2024 (modified: 19 Feb 2017) · ICLR 2017 workshop submission
Abstract: This note describes two simple techniques to stabilize the training of Generative Adversarial Networks (GANs) on multimodal data. First, we propose a covering initialization for the generator. This initialization pre-trains the generator to match the empirical mean and covariance of its samples with those of the real training data. Second, we propose using multimodal input noise distributions. Our experiments reveal that the joint use of these two simple techniques stabilizes GAN training, and produces generators with a richer diversity of samples. Our code is available at http://pastebin.com/GmHxL0e8.
TL;DR: Using multimodal noise distributions and initializing the generator to cover all the data distribution stabilizes GAN training.
Conflicts: fb.com, inria.fr
Keywords: Deep learning, Unsupervised Learning, Optimization
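
To make the two ideas in the abstract concrete, below is a minimal, illustrative PyTorch sketch of (1) a multimodal input noise distribution and (2) a covering initialization that pre-trains the generator to match the empirical mean and covariance of the real data. The generator architecture, the mixture-of-Gaussians noise prior, the squared-error moment-matching loss, and all names and hyperparameters (Generator, sample_multimodal_noise, covering_init, etc.) are assumptions made for illustration, not the authors' released code, which is linked in the abstract.

```python
# Illustrative sketch only; see the lead-in above for what is assumed.
import torch
import torch.nn as nn


class Generator(nn.Module):
    """Small MLP generator; architecture is an illustrative assumption."""
    def __init__(self, noise_dim=32, data_dim=2, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, data_dim),
        )

    def forward(self, z):
        return self.net(z)


def sample_multimodal_noise(batch_size, noise_dim, num_modes=8, scale=0.2):
    """Multimodal input noise: a mixture of Gaussians around fixed centers
    (one simple choice of multimodal prior; parameters are assumptions)."""
    g = torch.Generator().manual_seed(0)            # fixed mode centers
    centers = torch.randn(num_modes, noise_dim, generator=g)
    idx = torch.randint(0, num_modes, (batch_size,))
    return centers[idx] + scale * torch.randn(batch_size, noise_dim)


def covering_init(generator, real_data, noise_fn, steps=2000, lr=1e-3,
                  batch_size=256):
    """Covering initialization: pre-train the generator so the empirical mean
    and covariance of its samples match those of the real training data."""
    real_mean = real_data.mean(dim=0)
    real_cov = torch.cov(real_data.T)               # (D, D) covariance
    opt = torch.optim.Adam(generator.parameters(), lr=lr)
    for _ in range(steps):
        fake = generator(noise_fn(batch_size))
        fake_mean = fake.mean(dim=0)
        fake_cov = torch.cov(fake.T)
        # Squared error between first and second moments of fake and real data.
        loss = ((fake_mean - real_mean) ** 2).sum() \
             + ((fake_cov - real_cov) ** 2).sum()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return generator
```

In such a setup, `covering_init` would be run once before adversarial training begins, and `sample_multimodal_noise` would replace the usual unimodal Gaussian or uniform prior throughout GAN training.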