Rethinking the Noise Schedule of Diffusion-Based Generative Models

22 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Diffusion Model
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: In this work, we present both a theoretical and an empirical analysis of noise-scheduling strategies for denoising diffusion generative models. We study the training noise schedule through the lens of the power spectrum and introduce a novel metric, the weighted signal-to-noise ratio (WSNR), which represents noise levels uniformly in both RGB and latent spaces; WSNR-equivalent training noise schedules improve the performance of high-resolution models in both spaces. We then examine the reverse sampling process within the framework of ordinary differential equations (ODEs), characterizing the optimal denoiser and offering insights into data-driven sampling noise schedules. We further analyze how the number of evaluation points correlates with generation quality in order to accelerate the ODE solver used in diffusion sampling. Guided by this analysis, we propose an adaptive scheme that selects numerical methods under a given computational budget, balancing efficacy and efficiency. Without any additional training, our approach improves the FID of pre-trained CIFAR-10 and FFHQ-64 models from 1.92 and 2.45 to 1.89 and 2.25, respectively, using 35 network evaluations per image.
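The abstract defines WSNR only informally, as a power-spectrum-based measure of noise level. The paper's exact definition is not reproduced on this page, so the following is a minimal illustrative sketch under one plausible reading: weight the per-frequency SNR by the signal's own spectral energy, so that frequencies carrying more signal power contribute more to the aggregate ratio. The function name `weighted_snr` and the specific weighting are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def weighted_snr(x, sigma):
    """Illustrative weighted SNR of a 2D signal x corrupted by
    additive white Gaussian noise with standard deviation sigma.
    This is a sketch of one possible spectral weighting, not the
    paper's official WSNR definition."""
    # Power spectrum of the clean signal (unnormalized FFT convention).
    spec = np.abs(np.fft.fft2(x)) ** 2
    # White Gaussian noise of std sigma contributes a flat expected
    # power of sigma**2 * x.size per frequency bin in this convention.
    noise_power = sigma ** 2 * x.size
    # Per-frequency SNR, averaged with weights proportional to the
    # signal's own spectral energy.
    per_freq_snr = spec / noise_power
    weights = spec / spec.sum()
    return float((weights * per_freq_snr).sum())
```

Under this sketch, increasing the noise level lowers the WSNR monotonically, which is the property a schedule-comparison metric needs; the spectral weighting is what would let the same numeric value describe noise levels consistently across RGB and latent representations with different frequency content.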
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5772