CM-GAN: Stabilizing GAN Training with Consistency Models

Published: 19 Jun 2023, Last Modified: 28 Jul 2023 · 1st SPIGM Workshop @ ICML 2023 (Poster)
Keywords: GAN, Consistency Model, Diffusion, Probability Flow ODE
Abstract: In recent years, generative adversarial networks (GANs) have gained attention for their ability to generate realistic images, despite being notoriously difficult to train. Diffusion models, on the other hand, have emerged as a promising alternative, offering stable training and avoiding mode collapse; however, their generation process is computationally expensive. To overcome this problem, Song et al. (2023) proposed consistency models (CMs), which are optimized through a novel consistency constraint induced by the underlying diffusion process. In this paper, we show that the same consistency constraint can be used to stabilize the training of GANs and alleviate the notorious mode collapse problem. In this way, we provide a method that combines the main strengths of diffusion models and GANs while mitigating their major drawbacks. Additionally, since the technique can also be viewed as a way to fine-tune consistency models with a discriminator, it is expected to outperform CMs in general. We provide preliminary empirical results on MNIST to corroborate our claims.
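To make the core idea concrete, below is a minimal PyTorch sketch (not the authors' implementation) of how a consistency constraint in the spirit of Song et al. (2023) might be added to a GAN generator objective. The network signatures, the simplified noise schedule, and names such as `ema_generator` and `lambda_cm` are illustrative assumptions.

```python
# Sketch: GAN generator step regularized with a consistency-model loss.
# Assumes generator(x, t) and discriminator(x) modules; all hyperparameters
# (sigma_max, lambda_cm, ema_decay) are illustrative, not from the paper.
import torch
import torch.nn.functional as F

def consistency_loss(generator, ema_generator, x, t_next, t_curr):
    """Consistency constraint along the probability-flow ODE: outputs at
    two adjacent noise levels of the same trajectory should match."""
    noise = torch.randn_like(x)
    x_next = x + t_next.view(-1, 1, 1, 1) * noise   # noisier sample
    x_curr = x + t_curr.view(-1, 1, 1, 1) * noise   # less noisy sample
    pred = generator(x_next, t_next)                # online network
    with torch.no_grad():
        target = ema_generator(x_curr, t_curr)      # EMA "teacher" target
    return F.mse_loss(pred, target)

def generator_step(generator, ema_generator, discriminator, x_real,
                   opt_g, lambda_cm=1.0, sigma_max=80.0, ema_decay=0.999):
    batch = x_real.size(0)
    device = x_real.device

    # Sample two adjacent noise levels t_curr < t_next (simplified schedule).
    t_curr = torch.rand(batch, device=device) * sigma_max
    t_next = t_curr + 0.1 * sigma_max

    # Adversarial loss on one-step samples drawn from pure noise.
    z = torch.randn_like(x_real) * sigma_max
    fake = generator(z, torch.full((batch,), sigma_max, device=device))
    adv = -discriminator(fake).mean()               # non-saturating G loss

    # Consistency regularizer computed on real data.
    cm = consistency_loss(generator, ema_generator, x_real, t_next, t_curr)

    loss = adv + lambda_cm * cm
    opt_g.zero_grad()
    loss.backward()
    opt_g.step()

    # Update the EMA copy that provides the consistency targets.
    with torch.no_grad():
        for p, p_ema in zip(generator.parameters(),
                            ema_generator.parameters()):
            p_ema.mul_(ema_decay).add_(p, alpha=1 - ema_decay)
    return loss.item()
```

The sketch illustrates the abstract's claim: the consistency term acts as a diffusion-induced regularizer on the generator, while the adversarial term plays the role of a discriminator-based fine-tuning signal for the consistency model.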
Submission Number: 63