Multimodal Noise and Covering Initializations for GANs
David Lopez-Paz, Maxime Oquab
Feb 17, 2017 (modified: Feb 19, 2017) · ICLR 2017 workshop submission · Readers: everyone
Abstract: This note describes two simple techniques to stabilize the training of Generative Adversarial Networks (GANs) on multimodal data. First, we propose a covering initialization for the generator. This initialization pre-trains the generator to match the empirical mean and covariance of its samples with those of the real training data. Second, we propose using multimodal input noise distributions. Our experiments reveal that the joint use of these two simple techniques stabilizes GAN training, and produces generators with a richer diversity of samples. Our code is available at http://pastebin.com/GmHxL0e8.
TL;DR: Using multimodal noise distributions and initializing the generator to cover all the data distribution stabilizes GAN training.
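The two ideas above can be sketched in a few lines. The snippet below is a minimal NumPy illustration, not the authors' implementation (which is at the pastebin link): `sample_multimodal_noise` draws generator inputs from a mixture of Gaussians instead of a single unimodal distribution, and `covering_loss` is one possible moment-matching objective for the covering initialization, penalizing the squared difference between the empirical mean and covariance of generated samples and those of the real data. All function names and the choice of mixture are illustrative assumptions.

```python
import numpy as np


def sample_multimodal_noise(n, centers, scale=0.1, rng=None):
    """Draw n noise vectors from a mixture of isotropic Gaussians.

    centers: (modes, dim) array of mode means (fixed ahead of training).
    This is one plausible multimodal noise distribution; the paper does
    not prescribe a specific mixture, so this is an assumption.
    """
    rng = np.random.default_rng(rng)
    idx = rng.integers(0, len(centers), size=n)
    return centers[idx] + scale * rng.standard_normal((n, centers.shape[1]))


def covering_loss(fake, real):
    """Moment-matching loss for the covering initialization (sketch).

    Squared Frobenius distance between the empirical means and between
    the empirical covariances of fake and real samples. Pre-training the
    generator to minimize this encourages its samples to "cover" the
    spread of the real data before adversarial training begins.
    """
    mean_gap = fake.mean(axis=0) - real.mean(axis=0)
    cov_gap = np.cov(fake, rowvar=False) - np.cov(real, rowvar=False)
    return float(np.sum(mean_gap ** 2) + np.sum(cov_gap ** 2))
```

In a full pipeline one would minimize `covering_loss(generator(z), x_real)` by gradient descent for a few epochs before switching to the usual adversarial objective, feeding `z = sample_multimodal_noise(...)` throughout.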