Keywords: Deterministic sampling, flow-based transport
Abstract: We study the problem of transport-based sampling for target distributions that are implicitly defined as minimizers of objective functions. This formulation generalizes existing approaches that rely on learning time-varying scores under specific divergences, such as KL minimization. Recent advances in score-based transport methods highlight several advantages over Langevin dynamics, including smooth deterministic trajectories and monotone, noise-free convergence. Motivated by these benefits, we develop a stochastic Wasserstein gradient flow framework in which particle-based estimators approximate the Wasserstein gradient and transport an arbitrary initial distribution toward the target. We establish a convergence analysis that accounts for the mean and variance of these stochastic gradient estimates. We further demonstrate applications to multi-objective optimization and particle transport, leveraging maximum mean discrepancy and Wasserstein distance as guiding metrics.
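To make the particle-based gradient flow idea concrete, the following is a minimal, hypothetical sketch (not the paper's actual estimator): particles follow an explicit Euler discretization of the maximum mean discrepancy (MMD) gradient flow toward a set of target samples, using a Gaussian kernel. All names (`rbf_kernel_grad`, `mmd_flow_step`), the kernel bandwidth, and the step size are illustrative assumptions.

```python
import numpy as np

def rbf_kernel_grad(x, y, sigma=2.0):
    # Gradient w.r.t. x of k(x, y) = exp(-||x - y||^2 / (2 sigma^2)),
    # computed pairwise: result has shape (n, m, d).
    diff = x[:, None, :] - y[None, :, :]           # (n, m, d)
    sq = np.sum(diff ** 2, axis=-1)                # (n, m)
    k = np.exp(-sq / (2 * sigma ** 2))             # (n, m)
    return -(diff / sigma ** 2) * k[:, :, None]    # (n, m, d)

def mmd_flow_step(particles, target, step=0.5, sigma=2.0):
    """One explicit Euler step of the MMD gradient flow on particle positions.

    The velocity field is the negative gradient of the MMD witness function:
    repulsion among particles (diversity) plus attraction to target samples.
    """
    n, m = len(particles), len(target)
    g_pp = rbf_kernel_grad(particles, particles, sigma).sum(axis=1) / n
    g_pt = rbf_kernel_grad(particles, target, sigma).sum(axis=1) / m
    return particles - step * (g_pp - g_pt)

rng = np.random.default_rng(0)
particles = rng.normal(-3.0, 0.5, size=(200, 1))   # arbitrary initial distribution
target = rng.normal(2.0, 1.0, size=(200, 1))       # samples from the target

for _ in range(500):
    particles = mmd_flow_step(particles, target)

print(float(particles.mean()))  # should drift from -3 toward the target mean
```

In the stochastic setting the paper describes, `g_pp` and `g_pt` would be estimated from mini-batches of particles and target samples, introducing the mean and variance effects that the convergence analysis accounts for.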
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 24171