Mind Your Step: Continuous Conditional GANs with Generator Regularization

03 Oct 2022 (modified: 06 Dec 2023) · NeurIPS 2022 SyntheticData4ML · Readers: Everyone
Keywords: conditional GAN, time series generation
TL;DR: We propose a simple generator regularization term on the GAN generator loss for continuous conditions and apply it to time series generation.
Abstract: Conditional Generative Adversarial Networks are known to be difficult to train, especially when the conditions are continuous and high-dimensional. To partially alleviate this difficulty, we propose a simple regularization term on the GAN generator loss in the form of a Lipschitz penalty. The intuition behind this penalty is that, when the generator is fed neighboring conditions in the continuous condition space, it should produce samples whose conditional distributions are also close; the regularization term leverages this neighbor information to push the generator toward such behavior. We analyze the effect of the proposed regularization term and demonstrate its robust performance on a range of synthetic tasks as well as real-world conditional time series generation tasks.
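A minimal PyTorch sketch of how such a generator-side Lipschitz penalty could look; the function name, the perturbation scale sigma, and the weight lambda_reg are our assumptions for illustration, not the paper's exact formulation:

```python
import torch

def generator_lipschitz_penalty(generator, z, c, sigma=0.1):
    """Penalize how quickly the generator's output changes when the
    continuous condition c is perturbed to a neighboring condition.
    sigma (perturbation scale) is a hypothetical hyperparameter."""
    c_neighbor = c + sigma * torch.randn_like(c)  # sample a nearby condition
    x = generator(z, c)
    x_neighbor = generator(z, c_neighbor)
    # Per-sample estimate of the Lipschitz ratio ||G(z,c) - G(z,c')|| / ||c - c'||
    output_gap = (x - x_neighbor).flatten(1).norm(dim=1)
    condition_gap = (c - c_neighbor).flatten(1).norm(dim=1) + 1e-8
    return (output_gap / condition_gap).mean()

# Hypothetical usage inside the generator update step:
# g_loss = adversarial_loss + lambda_reg * generator_lipschitz_penalty(G, z, c)
```

Keeping this ratio small encourages nearby conditions to map to nearby output distributions, which is the smoothness property the abstract describes.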