Training Flow Matching: The Role of Weighting and Parameterization

Published: 03 Mar 2026, Last Modified: 05 Mar 2026 · ICLR 2026 DeLTa Workshop Poster · CC BY 4.0
Keywords: flow matching, diffusion, design choices, parametrization, noise prediction, denoising, velocity
TL;DR: We disentangle the various factors that matter when training a flow matching or diffusion model
Abstract: We study the training objectives of denoising-based generative models, with a particular focus on loss weighting and output parameterization, including noise-, clean image-, and velocity-based formulations. Through a systematic numerical study, we analyze how these training choices interact with the intrinsic dimensionality of the data manifold, model architecture, and dataset size. Our experiments span synthetic datasets with controlled geometry as well as image data, and compare training objectives using quantitative metrics for denoising accuracy (PSNR across noise levels) and generative quality (FID). Rather than proposing a new method, our goal is to disentangle the various factors that matter when training a flow matching model, in order to provide practical insights on design choices.
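The abstract contrasts noise-, clean image-, and velocity-based output parameterizations. As a minimal sketch (assuming the standard linear, rectified-flow interpolation path; variable names are illustrative, not the authors' code), the three prediction targets are linear reparameterizations of one another given the noisy sample and the time:

```python
import numpy as np

rng = np.random.default_rng(0)
x0 = rng.normal(size=(4, 8))    # "clean" data sample
eps = rng.normal(size=(4, 8))   # Gaussian noise
t = 0.3                         # interpolation time in (0, 1)

# Linear (rectified-flow) path: x_t = (1 - t) * x0 + t * eps
x_t = (1 - t) * x0 + t * eps

# The three common regression targets:
v_target = eps - x0             # velocity parameterization
eps_target = eps                # noise parameterization
x0_target = x0                  # clean-image parameterization

# Given x_t, t, and any one target, the others are recoverable,
# so the parameterizations differ only in how errors are weighted:
x0_from_v = x_t - t * v_target
eps_from_v = x_t + (1 - t) * v_target
```

Because the targets are interconvertible, choosing among them amounts to choosing an implicit, time-dependent loss weighting, which is the interaction the paper studies.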
Submission Number: 129