A Unified Framework for Consistency Generative Modeling

21 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Consistency models; Generative modeling
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We provide a unified and scalable framework for consistency generative modeling.
Abstract: Consistency modeling, a novel generative paradigm inspired by diffusion models, has gained traction for its capacity to enable real-time generation through single-step sampling. While its advantages are evident, a clear understanding of its underlying principles and effective algorithmic enhancements remains elusive. In response, we present a unified framework for consistency generative modeling that does not resort to a predefined diffusion process; instead, it directly constructs a probability density path bridging the prior and data distributions. Building upon this perspective, we introduce a more general consistency training objective that encapsulates previous consistency models and paves the way for new consistency generation techniques. In particular, we introduce two novel models, Poisson Consistency Models (PCMs) and Coupling Consistency Models (CCMs), which extend the prior distribution of latent variables beyond the Gaussian form, significantly increasing the flexibility of generative modeling. Furthermore, we harness the principles of Optimal Transport (OT) to reduce variance during consistency training, substantially improving convergence and generative quality. Extensive experiments on synthetic and real-world data generation, as well as image-to-image translation (I2I) tasks, demonstrate the effectiveness of the proposed approaches.
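
To make the abstract's training recipe concrete, below is a minimal, hypothetical sketch of consistency training along a generic bridging path with a mini-batch OT coupling between prior and data samples. None of the names or design choices (ConsistencyNet, ot_pair, consistency_training_step, the linear path, the timestep grid) come from the submission; they are illustrative placeholders assumed only to convey the general idea.

```python
# Hypothetical sketch of consistency training with a mini-batch OT coupling.
# All names and the linear bridging path are illustrative assumptions, not the
# paper's exact method.
import torch
import torch.nn as nn
from scipy.optimize import linear_sum_assignment


class ConsistencyNet(nn.Module):
    """Toy consistency function f_theta(x, t): maps any point on the path to its data endpoint."""

    def __init__(self, dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, 128), nn.SiLU(),
            nn.Linear(128, 128), nn.SiLU(),
            nn.Linear(128, dim),
        )

    def forward(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([x, t.unsqueeze(-1)], dim=-1))


def ot_pair(z: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
    """Reorder the data batch so (z[i], x[i]) approximates the mini-batch OT coupling
    (Hungarian matching on pairwise Euclidean costs), reducing pairing variance."""
    cost = torch.cdist(z.flatten(1), x.flatten(1)).cpu().numpy()
    _, col = linear_sum_assignment(cost)
    return x[torch.as_tensor(col, device=x.device)]


def consistency_training_step(f_theta, f_target, x, z, t_grid):
    """Penalize the discrepancy between outputs at two adjacent times on the same path."""
    x = ot_pair(z, x)                                   # OT-coupled (prior, data) pairs
    n = torch.randint(0, len(t_grid) - 1, (x.shape[0],))
    s, t = t_grid[n], t_grid[n + 1]
    # Illustrative linear path from data (t=0) to the prior sample (t=1);
    # the abstract describes a framework admitting more general bridging paths.
    xs = (1 - s.view(-1, 1)) * x + s.view(-1, 1) * z
    xt = (1 - t.view(-1, 1)) * x + t.view(-1, 1) * z
    return (f_theta(xt, t) - f_target(xs, s).detach()).pow(2).mean()


if __name__ == "__main__":
    dim, batch = 2, 256
    f_theta = ConsistencyNet(dim)
    f_target = ConsistencyNet(dim)
    f_target.load_state_dict(f_theta.state_dict())      # target copy of the online network
    optimizer = torch.optim.Adam(f_theta.parameters(), lr=1e-3)
    t_grid = torch.linspace(0.0, 1.0, 18)                # discretized path times
    x = 1.0 + 0.1 * torch.randn(batch, dim)              # stand-in "data" samples
    z = torch.randn(batch, dim)                          # samples from the prior
    loss = consistency_training_step(f_theta, f_target, x, z, t_grid)
    loss.backward()
    optimizer.step()
```

Conceptually, the OT matching replaces an independent random pairing of prior and data samples with low-cost pairs, which is the sense in which such a coupling can lower the variance of the consistency training signal.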
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3337