Continuous Conditional Generative Adversarial Networks (cGAN) with Generator Regularization

Published: 01 Jan 2021, Last Modified: 13 May 2023. Venue: CoRR 2021.
Abstract: Conditional Generative Adversarial Networks are known to be difficult to train, especially when the conditions are continuous and high-dimensional. To partially alleviate this difficulty, we propose a simple regularization term on the GAN generator loss in the form of a Lipschitz penalty. When the generator is fed neighboring conditions in the continuous condition space, the regularization term leverages this neighborhood information and pushes the generator to produce samples whose conditional distributions are similar for neighboring conditions. We analyze the effect of the proposed regularization term and demonstrate its robust performance on a range of synthetic and real-world tasks.
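The abstract describes a Lipschitz penalty added to the generator loss so that nearby conditions yield similar generated samples. Below is a minimal PyTorch sketch of one such penalty; the function name, the perturbation scale `sigma`, the target constant `lipschitz_k`, and the `generator(z, y)` signature are illustrative assumptions, not the paper's reference implementation.

```python
import torch

def generator_lipschitz_penalty(generator, z, y, sigma=0.1, lipschitz_k=1.0):
    """Penalize the generator when neighboring conditions map to distant samples.

    Assumes generator(z, y) -> samples, with z the latent noise and y the
    continuous condition (hypothetical interface for this sketch).
    """
    # Perturb each condition to obtain a neighboring condition y'.
    y_neighbor = y + sigma * torch.randn_like(y)

    # Generate samples for the original and neighboring conditions,
    # keeping the latent code fixed.
    x = generator(z, y)
    x_neighbor = generator(z, y_neighbor)

    # Finite-difference estimate of the Lipschitz ratio per example:
    # ||G(z, y) - G(z, y')|| / ||y - y'||.
    batch = y.size(0)
    num = (x - x_neighbor).reshape(batch, -1).norm(dim=1)
    den = (y - y_neighbor).reshape(batch, -1).norm(dim=1) + 1e-12
    ratio = num / den

    # One-sided penalty: only ratios exceeding the target constant are penalized.
    return torch.clamp(ratio - lipschitz_k, min=0.0).pow(2).mean()
```

In a standard cGAN training loop, this term would be added to the usual adversarial generator loss with a tunable weight, e.g. `g_loss = adv_loss + lam * generator_lipschitz_penalty(G, z, y)` (with `lam` a hypothetical hyperparameter).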