The Discretization Complexity Analysis of Consistency Models under Variance Exploding Forward Process

ICLR 2025 Conference Submission 3617 Authors

24 Sept 2024 (modified: 19 Nov 2024) · ICLR 2025 Conference Submission · CC BY 4.0
Keywords: Discretization Complexity, Consistency Model
Abstract: Consistency models, a new class of one-step generative models, have shown state-of-the-art performance in one-step generation and achieve competitive performance compared to multi-step diffusion models. The most challenging part of consistency models is the training process, which discretizes the diffusion process and trains a consistency function to map any point at any discretized timepoint of the diffusion process to the data distribution. Despite this empirical success, only a few works study the discretization complexity of consistency models. Moreover, the settings of those works are far from those of empirically successful consistency models, suffer from large discretization complexity, and fail to explain the empirical success of consistency models. To bridge the gap between theory and practice, we analyze consistency models with two key properties that are widely used in empirical consistency models: (1) a variance-exploding forward process and (2) a gradually decaying discretization stepsize. Under this realistic setting, we take a first step toward explaining the empirical success of consistency models and achieve a state-of-the-art discretization complexity for consistency models, competitive with the results for diffusion models. After obtaining results for the one-step sampling method of consistency models, we further analyze a multi-step consistency sampling algorithm proposed by \citet{song2023consistency} and show that it improves the discretization complexity compared with one-step generation, which matches empirical observations.
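For readers unfamiliar with the multi-step sampler analyzed above, the following is a minimal sketch of the multi-step consistency sampling loop of \citet{song2023consistency} under a variance-exploding forward process. The consistency function `f`, the noise schedule `sigmas`, and the floor level `eps` are placeholders for the trained model and hyperparameters; they are assumptions for illustration, not part of this submission.

```python
import numpy as np

def multistep_consistency_sampling(f, sigmas, dim, eps=0.002, rng=None):
    """Sketch of multi-step consistency sampling (Song et al., 2023)
    under a variance-exploding (VE) forward process.

    f      : assumed trained consistency function f(x, sigma) mapping a
             noisy sample at noise level sigma back toward the data.
    sigmas : decreasing sequence of noise levels, starting at sigma_max.
    """
    rng = np.random.default_rng() if rng is None else rng
    # One-step generation: denoise pure VE noise at the maximal level.
    x = f(rng.standard_normal(dim) * sigmas[0], sigmas[0])
    # Each additional step re-noises to a lower level and denoises
    # again; these extra steps are what reduce the discretization error.
    for sigma in sigmas[1:]:
        z = rng.standard_normal(dim)
        x_noisy = x + np.sqrt(sigma**2 - eps**2) * z
        x = f(x_noisy, sigma)
    return x
```

With `sigmas` of length 1 this reduces to one-step generation; longer schedules trade extra model evaluations for the improved discretization complexity discussed in the abstract.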
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3617