DiffFlow: A Unified SDE for Score-Based Diffusion Models and Generative Adversarial Networks

22 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: generative adversarial networks, diffusion models, stochastic differential equations
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: Design a novel SDE that unifies GANs and diffusion models.
Abstract: Generative models can be categorized into two types: i) explicit generative models that define an explicit density form and allow exact likelihood inference, such as score-based diffusion models (SDMs) and normalizing flows; and ii) implicit generative models that directly learn a transformation from the prior to the data distribution, such as generative adversarial networks (GANs). While both types of models have shown great success, each suffers from limitations that prevent it from achieving fast sampling and high sample quality simultaneously. In this paper, we propose a unified theoretical framework for SDMs and GANs. We show that: i) the learning dynamics of both SDMs and GANs can be described by a novel SDE named **D**iscriminator Deno**i**sing Di**ff**usion **Flow** (**DiffFlow**), whose drift is a weighted combination of scores of the real data and the generated data; ii) by adjusting the relative weights of the score terms, we obtain a smooth transition between SDMs and GANs while the marginal distribution of the SDE remains invariant to the change of the weights; iii) we prove asymptotic and non-asymptotic convergence of the continuous SDE dynamics of DiffFlow under a weak isoperimetry condition on the smoothed target distribution; iv) within our unified framework, we introduce several instantiations of DiffFlow that recover recently proposed hybrid GAN-diffusion algorithms, e.g., TDPM (Zheng et al., 2022), as special cases. Our framework unifies GANs and SDMs into a continuous spectrum, and hence offers the potential to design new generative learning algorithms that achieve a flexible trade-off between high sample quality and fast sampling speed, beyond existing GAN- and/or SDM-like algorithms.
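To make the abstract's claim (i) concrete, a schematic form of such an SDE can be written down; note this is an illustrative sketch based only on the abstract's description, with the weight functions $\lambda_1(t)$, $\lambda_2(t)$ and diffusion coefficient $g(t)$ assumed as placeholders, and the paper's exact parameterization of DiffFlow may differ:

```latex
% Particle dynamics X_t driven by a weighted combination of scores:
%   - the score of the (smoothed) data distribution, p_{data},
%   - the score of the current generated distribution, p_t.
\mathrm{d}X_t \;=\; \Bigl[\,\lambda_1(t)\,\nabla_x \log p_{\mathrm{data}}(X_t)
      \;-\; \lambda_2(t)\,\nabla_x \log p_t(X_t)\Bigr]\,\mathrm{d}t
      \;+\; g(t)\,\mathrm{d}W_t
```

Under this reading, tuning the relative magnitudes of $\lambda_1$ and $\lambda_2$ (and $g$) would move the dynamics along the continuum between score-based diffusion sampling and GAN-style transport, which is the interpolation the abstract's claim (ii) asserts.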
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5518