Warm Starts Accelerate Conditional Diffusion

18 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: diffusion, efficient diffusion, conditional diffusion, inpainting, image inpainting, weather forecasting, probabilistic modelling, probabilistic forecasting
Abstract: Generative models such as diffusion and flow matching create high-fidelity samples by progressively refining noise. This refinement process is notoriously slow, often requiring hundreds of function evaluations. We introduce Warm-Start Diffusion (WSD), a method that uses a simple, deterministic model to dramatically accelerate conditional generation by providing a better starting point. Instead of starting generation from an uninformed $\mathcal{N}(\mathbf{0}, I)$ prior, our deterministic warm-start model predicts an informed prior $\mathcal{N}(\hat{\boldsymbol{\mu}}_C, \text{diag}(\hat{\boldsymbol{\sigma}}^2_C))$, whose moments are conditioned on the input context $C$. This warm start substantially reduces the distance the generative process must traverse, and therefore the number of diffusion steps required, when the context $C$ is strongly informative. WSD is applicable to any standard diffusion or flow-matching method, is orthogonal to and synergistic with other fast-sampling techniques such as efficient solvers, and is simple to implement. We test WSD in a variety of settings and find that it substantially outperforms standard diffusion in the efficient-sampling regime, generating realistic samples with only 4-6 function evaluations and saturating performance at 10-12.
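The sampling loop described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the warm-start model, denoiser, and step schedule below are hypothetical stand-ins, and the informed prior $\mathcal{N}(\hat{\boldsymbol{\mu}}_C, \text{diag}(\hat{\boldsymbol{\sigma}}^2_C))$ is realized by a toy function that any trained network could replace.

```python
import numpy as np

def warm_start_sample(context, warm_start_model, denoise_step, num_steps=6, rng=None):
    """Sketch of Warm-Start Diffusion (WSD) sampling.

    Instead of drawing the initial latent from N(0, I), a cheap deterministic
    model predicts a context-conditioned prior N(mu_C, diag(sigma_C^2));
    the sampler then refines from that informed start in far fewer steps.
    """
    rng = rng or np.random.default_rng(0)
    mu, sigma = warm_start_model(context)            # informed prior moments
    x = mu + sigma * rng.standard_normal(mu.shape)   # x_T ~ N(mu_C, diag(sigma_C^2))
    for t in np.linspace(1.0, 0.0, num_steps, endpoint=False):
        x = denoise_step(x, t, context)              # one step of any standard solver
    return x

# Hypothetical stand-ins: the warm-start model returns the context as the mean
# with a fixed small spread; the "denoiser" simply contracts toward the context
# (a placeholder for a trained diffusion/flow-matching network).
toy_warm_start = lambda c: (c, 0.1 * np.ones_like(c))
toy_denoise = lambda x, t, c: x + 0.5 * (c - x)

ctx = np.ones(4)
sample = warm_start_sample(ctx, toy_warm_start, toy_denoise, num_steps=6)
```

With an informative context, the warm start places the initial sample close to the target, so even the toy contraction above converges in a handful of steps, mirroring the 4-6 evaluation regime the paper reports.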
Supplementary Material: zip
Primary Area: generative models
Submission Number: 11215