Stochastic Optimization in Semi-Discrete Optimal Transport: Convergence Analysis and Minimax Rate

Published: 17 Sept 2025, Last Modified: 28 Sept 2025 · NeurIPS 2025 · CC BY 4.0
Abstract: We investigate the semi-discrete Optimal Transport (OT) problem, where a continuous source measure $\mu$ is transported to a discrete target measure $\nu$, with particular attention to approximating the OT map. In this setting, Stochastic Gradient Descent (SGD) based solvers have demonstrated strong empirical performance in recent machine learning applications, yet theoretical guarantees that they approximate the OT map have remained an open question. In this work, we answer this question positively by providing both computational and statistical convergence guarantees for SGD. Specifically, we show that SGD methods can estimate the OT map with a minimax convergence rate of $\mathcal{O}(1/\sqrt{n})$, where $n$ is the number of samples drawn from $\mu$. To establish this result, we study the averaged projected SGD algorithm and identify a suitable projection set that contains a minimizer of the objective, even when the source measure is not compactly supported. Our analysis holds under mild assumptions on the source measure and applies to MTW cost functions, which include $\|\cdot\|^p$ for $p \in (1, \infty)$. Finally, we provide numerical evidence for our theoretical results.
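To make the setting concrete, the following is a minimal illustrative sketch of averaged projected SGD on the semi-discrete OT dual, not the paper's exact algorithm. It uses the squared Euclidean cost, a fixed $\ell_\infty$-ball as the projection set, and a $1/\sqrt{t}$ step size; all function names, the step-size schedule, and the choice of projection radius are assumptions for illustration.

```python
import numpy as np

def averaged_projected_sgd_semidiscrete(sample_mu, targets, weights,
                                        n_iters=5000, lr=1.0, radius=10.0,
                                        seed=0):
    """Averaged projected SGD sketch for the semi-discrete OT dual.

    Maximizes F(v) = E_{x~mu}[min_j c(x, y_j) - v_j] + sum_j v_j * nu_j
    with c(x, y) = ||x - y||^2, projecting each iterate onto an
    l_inf ball of the given radius and returning the running average.
    (Illustrative choices; the paper's projection set and schedule differ.)
    """
    rng = np.random.default_rng(seed)
    m = len(weights)
    v = np.zeros(m)       # dual potential on the discrete support
    v_avg = np.zeros(m)   # Polyak-style iterate average
    for t in range(1, n_iters + 1):
        x = sample_mu(rng)                          # one sample from mu
        costs = np.sum((targets - x) ** 2, axis=1)  # c(x, y_j) for all j
        j_star = np.argmin(costs - v)               # Laguerre cell containing x
        grad = weights.astype(float).copy()         # gradient of sum_j v_j nu_j
        grad[j_star] -= 1.0                         # minus indicator of active cell
        v = v + (lr / np.sqrt(t)) * grad            # stochastic ascent step
        v = np.clip(v, -radius, radius)             # projection onto l_inf ball
        v_avg += (v - v_avg) / t                    # update running average
    return v_avg
```

With the averaged potential `v_avg` in hand, the estimated OT map sends a point $x$ to the target $y_{j^\*}$ with $j^\* = \arg\min_j c(x, y_j) - v_{j}$, matching the $\mathcal{O}(1/\sqrt{n})$ estimation setting described in the abstract.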