S\(^{2}\)-DMs: Skip-Step Diffusion Models

21 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Diffusion, DDIMs, DDPMs, Training algorithm
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Diffusion models have emerged as powerful generative tools, rivaling GANs in sample quality and matching the likelihood scores of autoregressive models. A subset of these models, exemplified by DDIMs, exhibits an inherent asymmetry: they are trained over $T$ steps but sample from only a subset of those steps during generation. This selective sampling, though optimized for speed, inadvertently discards vital information from the unsampled steps, potentially compromising sample quality. We refer to this phenomenon as ``asymmetric diffusion models''. To address this issue, we present the S\(^{2}\)-DMs, which use an innovative $L_{skip}$ loss, designed to reintegrate the information omitted during the selective sampling phase. The benefits of this approach are manifold: it notably enhances sample quality, is exceptionally simple to implement, requires minimal code modifications, and is flexible enough to be compatible with various sampling algorithms. The S\(^{2}\)-DMs achieve strong results on the CIFAR10 (32x32) and CelebA (64x64) datasets (e.g., FID scores of 8.01/6.41 in just 10 steps, surpassing DDIMs and PNDMs). Access to the code and additional resources is provided in the supplementary material.
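
To make the abstract's central idea concrete, below is a minimal, illustrative sketch of a training loss augmented with a skip-step term. The exact form of $L_{skip}$ is defined in the paper (and its code), not here; the skip interval `k`, the weight `lam`, and the epsilon-prediction formulation are assumptions chosen only to show how such a term could be added to a standard DDPM objective with minimal code changes.

```python
import torch
import torch.nn.functional as F

# Illustrative sketch only: the exact form of L_skip is defined in the paper.
# The skip interval k and weight lam below are hypothetical, not the authors' values.

def training_loss(model, x0, alphas_cumprod, T, k=10, lam=0.1):
    """Standard DDPM noise-prediction loss plus a hypothetical skip-step term
    tying the prediction at step t to the one at the skipped step t - k."""
    b = x0.shape[0]
    t = torch.randint(k, T, (b,), device=x0.device)
    noise = torch.randn_like(x0)

    # Forward diffusion q(x_t | x_0) at step t.
    a_t = alphas_cumprod[t].view(b, 1, 1, 1)
    x_t = a_t.sqrt() * x0 + (1 - a_t).sqrt() * noise

    # Standard denoising objective (epsilon prediction).
    eps_t = model(x_t, t)
    l_simple = F.mse_loss(eps_t, noise)

    # Hypothetical L_skip: noise x_0 to the skipped step t - k with the same
    # noise and also supervise the model there, so steps skipped at sampling
    # time still contribute information during training.
    t_skip = t - k
    a_s = alphas_cumprod[t_skip].view(b, 1, 1, 1)
    x_s = a_s.sqrt() * x0 + (1 - a_s).sqrt() * noise
    eps_s = model(x_s, t_skip)
    l_skip = F.mse_loss(eps_s, noise)

    return l_simple + lam * l_skip
```

As the abstract notes, the appeal of this style of objective is that it leaves the sampler untouched: the extra term changes only the training loop, so it composes with DDIM-, PNDM-, or other samplers without modification.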
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3479