Keywords: Diffusion Probabilistic Models, Exponential SDE methods, Image Generation, Generative Models
TL;DR: We propose a collection of versatile SDE sampling methods for Diffusion Models, with proven convergence guarantees, that reach optimal-quality sampling faster than previous SDE methods.
Abstract: Diffusion Probabilistic Models (DPMs) have become a prominent and potent
class of generative models. A forward diffusion process gradually adds noise
to data, while a model learns to gradually denoise. Sampling from pre-trained
DPMs amounts to solving differential equations (DEs) defined by the learnt
model, a process which has proven to be prohibitively slow. Numerous efforts at
speeding up this process have focused on crafting powerful ODE solvers.
Despite being quick, such solvers do not usually reach the optimal quality
achieved by available slow SDE solvers. Our goal is to propose SDE solvers that
reach optimal quality without requiring several hundreds or thousands of neural
function evaluations (NFEs). We propose Stochastic Explicit Exponential
Derivative-free Solvers (SEEDS), improving and generalizing Exponential
Integrator approaches to the stochastic case in several frameworks.
After carefully analyzing the formulation of exact
solutions of diffusion SDEs, we craft SEEDS to analytically compute the linear
part of such solutions. Inspired by the Exponential Time-Differencing method,
SEEDS use a novel treatment of the stochastic components of solutions,
enabling the analytical computation of their variance, and contain high-order
terms that allow reaching optimal-quality sampling $\sim3$-$5\times$ faster than previous
SDE methods. We validate our approach on several image generation benchmarks,
showing that SEEDS outperform or are competitive with previous SDE solvers.
Unlike the latter, SEEDS are derivative- and training-free, and we fully
prove strong convergence guarantees for them.
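For illustration only (this is a generic variation-of-constants sketch, not the paper's notation), the exponential-integrator idea behind analytically computing the linear part of a diffusion SDE solution, and the variance of its stochastic part, can be written for a scalar linear SDE as:

```latex
% Illustrative sketch, assuming a scalar linear SDE
%   dx_t = f(t)\,x_t\,dt + g(t)\,dW_t .
% Variation of constants gives the exact solution from time s to t:
x_t = e^{\int_s^t f(\tau)\,d\tau}\, x_s
      + \int_s^t e^{\int_r^t f(\tau)\,d\tau}\, g(r)\, dW_r .
% The Ito integral above is Gaussian with mean zero and variance
\sigma_{s,t}^2 = \int_s^t e^{2\int_r^t f(\tau)\,d\tau}\, g(r)^2\, dr ,
% so the linear part is handled exactly and the stochastic increment
% can be sampled in closed form as a N(0, \sigma_{s,t}^2) variable.
```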
Supplementary Material: zip
Submission Number: 14626