Progressive Multistep Data-free Diffusion Distillation

ICLR 2026 Conference Submission 16070 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Diffusion Models; Diffusion Distillation; Generative Models
Abstract: While one-step distillation methods achieve strong single-step generation, they are not inherently flexible for multi-step sampling. Efforts to adapt them beyond a single step often introduce a reliance on training data, poor generation quality at early intermediate steps, and significant computational demands. To overcome these limitations, we propose Progressive Multi-step Diffusion Distillation (PMDD), a unified framework that generalizes one-step distillation to the multi-step setting. PMDD adopts a recursive training strategy in which an N-step student is progressively refined into an (N+1)-step student with minimal finetuning. This process is enabled by a data-free sampling mechanism that generates intermediate states and an unforget loss that maintains quality across steps. Together, these components allow PMDD to match or surpass teacher fidelity with only a handful of function evaluations, while providing scalable, data-free training and substantially reduced computational overhead. Extensive experiments demonstrate that our method not only outperforms established few-step diffusion approaches but also exceeds teacher-level performance, achieving 1.99 FID on ImageNet $64\times64$ and 8.46 FID on zero-shot COCO $512\times512$, setting a new state of the art in multi-step data-free distillation at significantly lower resource cost.
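The abstract only sketches the recursive N → N+1 refinement at a high level; the snippet below is a minimal, hypothetical PyTorch sketch of what such a loop could look like, not the submission's actual method. Every name and choice here (StudentUNet, sample_steps, the MSE-based distillation and unforget terms, the 0.1 weighting, the linear step schedules) is an illustrative assumption, since the paper's architecture, schedules, and loss definitions are not given on this page.

```python
# Hypothetical sketch: progressively refine an N-step student into an (N+1)-step
# student using only self-generated (data-free) samples and an "unforget" term.
import copy
import torch
import torch.nn.functional as F

class StudentUNet(torch.nn.Module):
    """Toy stand-in for a few-step student denoiser (assumed, not from the paper)."""
    def __init__(self, dim=64):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(dim + 1, 256), torch.nn.SiLU(), torch.nn.Linear(256, dim)
        )

    def forward(self, x, t):
        # Broadcast the scalar timestep onto each sample and concatenate it.
        t_feat = torch.full((x.shape[0], 1), float(t), device=x.device)
        return self.net(torch.cat([x, t_feat], dim=1))

def sample_steps(model, x, timesteps):
    """Run the student over a fixed timestep sequence; x is pure noise (data-free)."""
    for t in timesteps:
        x = model(x, t)
    return x

def progressive_distill(one_step_student, max_steps=4, iters=100, dim=64):
    student = one_step_student
    for n in range(1, max_steps):
        prev_student = copy.deepcopy(student).eval()   # frozen N-step student
        student = copy.deepcopy(student)               # initialize the (N+1)-step student
        opt = torch.optim.Adam(student.parameters(), lr=1e-4)
        old_schedule = [1.0 - k / n for k in range(n)]
        new_schedule = [1.0 - k / (n + 1) for k in range(n + 1)]
        for _ in range(iters):
            noise = torch.randn(8, dim)                # no training images are used
            with torch.no_grad():
                # Targets come from the frozen N-step student's own trajectory.
                prev_out = sample_steps(prev_student, noise, old_schedule)
            # Distillation: the (N+1)-step trajectory should reach the same endpoint.
            distill = F.mse_loss(sample_steps(student, noise, new_schedule), prev_out)
            # Unforget: keep the truncated N-step behaviour close to the frozen student.
            unforget = F.mse_loss(sample_steps(student, noise, old_schedule), prev_out)
            loss = distill + 0.1 * unforget
            opt.zero_grad()
            loss.backward()
            opt.step()
    return student

if __name__ == "__main__":
    final_student = progressive_distill(StudentUNet(), max_steps=3, iters=10)
```

In this reading, the distillation term plays the role of the teacher-matching objective (a real implementation would presumably query the pretrained teacher or a one-step distillate), while the unforget term is one plausible way to "maintain quality across steps" as the abstract describes.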
Supplementary Material: pdf
Primary Area: generative models
Submission Number: 16070