Decoupled MeanFlow: Turning Flow Models into Flow Maps for Accelerated Sampling

ICLR 2026 Conference Submission19869 Authors

19 Sept 2025 (modified: 08 Oct 2025) · CC BY 4.0
Keywords: few-step diffusion, diffusion models, flow-based models, generative models, diffusion transformer
Abstract: Denoising generative models, such as diffusion and flow-based models, produce high-quality samples but require many denoising steps due to discretization error. Flow maps, which estimate the average velocity between timesteps, mitigate this error and enable faster sampling. However, their training typically demands architectural changes that limit compatibility with pretrained flow models. We introduce Decoupled MeanFlow, a simple decoding strategy that converts flow models into flow map models without architectural modifications. Our method conditions the final blocks of diffusion transformers on the subsequent timestep, allowing pretrained flow models to be directly repurposed as flow maps. Combined with enhanced training techniques, this design enables high-quality generation in as few as 1–4 steps. Notably, we find that training flow models and subsequently converting them is more efficient and effective than training flow maps from scratch. On ImageNet 256$\times$256, our model attains a 1-step FID of 2.16, surpassing prior art by a large margin, and achieves a 4-step FID of 1.51, matching the performance of standard flow models while delivering over 125$\times$ faster inference.
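The core idea in the abstract — a flow map predicts the *average* velocity over an interval, so one update can jump the whole interval rather than approximating it with many small Euler steps — can be sketched as follows. This is a minimal illustration, not the paper's implementation: `sample_flow_map`, the toy model, and the timestep convention (noise at t=1, data at t=0) are all assumptions for the example.

```python
import numpy as np

def sample_flow_map(model, x_t, steps):
    # Few-step sampling with a flow map u(x, t, r) that predicts the
    # AVERAGE velocity from time t down to time r, so each update
    # traverses its whole sub-interval in a single jump:
    #   x_r = x_t - (t - r) * u(x_t, t, r)
    ts = np.linspace(1.0, 0.0, steps + 1)  # noise at t=1, data at t=0
    x = x_t
    for t, r in zip(ts[:-1], ts[1:]):
        x = x - (t - r) * model(x, t, r)
    return x

# Toy example: for the linear path x_t = (1 - t) * x0 + t * eps, the true
# average velocity between any r < t is (eps - x0), independent of t,
# so even a single step recovers x0 exactly.
x0, eps = np.array([2.0, -1.0]), np.array([0.5, 0.5])
toy_model = lambda x, t, r: eps - x0  # hypothetical "oracle" flow map
x1 = eps  # the sample at t=1 is pure noise under this path
print(sample_flow_map(toy_model, x1, steps=1))  # recovers x0: [ 2. -1.]
```

With a learned network in place of `toy_model`, the same loop yields the paper's 1–4 step sampling regime; the number of network evaluations equals `steps`, which is the source of the reported speedup over many-step flow models.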
Primary Area: generative models
Submission Number: 19869