Diffusion Sampling Correction via Approximately 10 Parameters

Published: 01 May 2025, Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: We propose a new plug-and-play training paradigm for accelerated sampling of diffusion models, featuring minimal learnable parameters and training costs.
Abstract: While powerful for generation, Diffusion Probabilistic Models (DPMs) suffer from slow sampling, for which various distillation-based methods have been proposed. However, these typically require significant additional training costs and model parameter storage, limiting their practicality. In this work, we propose **P**CA-based **A**daptive **S**earch (PAS), which optimizes existing solvers for DPMs with minimal additional cost. Specifically, we first employ PCA to obtain a few basis vectors that span the high-dimensional sampling space, which enables us to learn just a set of coordinates to correct the sampling direction. Furthermore, based on the observation that the cumulative truncation error exhibits an "S"-shape, we design an adaptive search strategy that further enhances sampling efficiency and reduces the number of stored parameters to approximately 10. Extensive experiments demonstrate that PAS can significantly enhance existing fast solvers in a plug-and-play manner with negligible cost. For example, on CIFAR10, PAS improves DDIM's FID from 15.69 to 4.37 (NFE=10) using only **12 parameters and sub-minute training** on a single A100 GPU. Code is available at https://github.com/onefly123/PAS.
Lay Summary: While powerful for generation, diffusion models face slow sampling challenges. We propose **PAS**, a *plug-and-play* training paradigm designed to accelerate diffusion model sampling with *minimal costs*. PAS uses PCA to extract a few basis vectors to span the high-dimensional sampling space, allowing the correction of the sampling direction with only a set of coordinates. PAS also includes an adaptive search strategy to enhance sampling efficiency and reduce storage requirements. With only approximately 10 parameters and under an hour of training, PAS greatly improves sampling quality across various datasets, making diffusion models more practical for real-world applications.
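The core idea — extracting a few PCA basis vectors from the sampling space and learning only the coordinates that correct the solver's update direction — can be illustrated with a minimal numpy sketch. This is an illustrative approximation, not the authors' implementation: the helper names (`pca_basis`, `corrected_direction`) and the choice of `k` are hypothetical, and the real method operates on diffusion-solver trajectories rather than arbitrary vectors.

```python
import numpy as np

def pca_basis(vectors, k=3):
    """Extract the top-k principal directions from a collection of
    sampling directions (hypothetical helper; inputs are flattened)."""
    X = np.stack([np.ravel(v) for v in vectors])
    X = X - X.mean(axis=0)
    # Rows of Vt are the principal components, ordered by variance.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[:k]  # shape (k, dim)

def corrected_direction(d, basis, coords):
    """Correct a solver's update direction d with a learned linear
    combination of the basis vectors; `coords` plays the role of the
    ~10 learnable parameters described in the paper."""
    return d + coords @ basis  # (k,) @ (k, dim) -> (dim,)
```

In this picture, training reduces to optimizing the small `coords` vector (one set per corrected step) against a quality objective, which is why the stored-parameter count stays around ten.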
Link To Code: https://github.com/onefly123/PAS
Primary Area: Deep Learning->Generative Models and Autoencoders
Keywords: diffusion models, accelerated sampling, low-cost training, plug-and-play
Submission Number: 8569