Universal Neural Optimal Transport

Published: 01 May 2025, Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: We train a neural network to accurately predict optimal transport distances across datasets and dimensions with neural operators.
Abstract: Optimal Transport (OT) problems are a cornerstone of many applications, but solving them is computationally expensive. To address this problem, we propose UNOT (Universal Neural Optimal Transport), a novel framework capable of accurately predicting (entropic) OT distances and plans between discrete measures of variable resolution for a given cost function. UNOT builds on Fourier Neural Operators, a universal class of neural networks that map between function spaces and that are discretization-invariant, which enables our network to process measures of varying sizes. The network is trained adversarially using a second, generating network and a self-supervised bootstrapping loss. We theoretically justify the use of FNOs, prove that our generator is universal, and that minimizing the bootstrapping loss provably minimizes the ground truth loss. Through extensive experiments, we show that our network not only accurately predicts optimal transport distances and plans across a wide range of datasets, but also captures the geometry of the Wasserstein space correctly. Furthermore, we show that our network can be used as a state-of-the-art initialization for the Sinkhorn algorithm, significantly outperforming existing approaches.
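The abstract's headline application — warm-starting the Sinkhorn algorithm with a network prediction — can be sketched as follows. This is a minimal NumPy implementation of entropic OT between discrete measures, not the paper's code; the `u_init` argument is a hypothetical stand-in for the scaling vector that a predictor like UNOT would supply instead of the default all-ones initialization.

```python
import numpy as np

def sinkhorn(mu, nu, C, eps=0.05, iters=500, u_init=None):
    """Entropic OT between discrete measures mu and nu with cost matrix C.

    u_init optionally warm-starts the Sinkhorn scaling vector; a learned
    predictor could supply it to cut the number of iterations needed.
    Returns the transport plan P and the (non-regularized) transport cost.
    """
    K = np.exp(-C / eps)                      # Gibbs kernel
    u = np.ones_like(mu) if u_init is None else u_init.copy()
    for _ in range(iters):
        v = nu / (K.T @ u)                    # alternating marginal
        u = mu / (K @ v)                      # constraint projections
    P = u[:, None] * K * v[None, :]           # transport plan
    return P, float(np.sum(P * C))

# Two uniform measures on two points with a symmetric cost: the optimal
# plan is (near-)diagonal, so the transport cost is close to zero.
mu = np.array([0.5, 0.5])
nu = np.array([0.5, 0.5])
C = np.array([[0.0, 1.0], [1.0, 0.0]])
P, cost = sinkhorn(mu, nu, C)
```

A good `u_init` does not change the fixed point, only how fast the iterations reach it, which is why a predicted initialization can be swapped in without affecting correctness. (For small `eps` or large costs, production code would run these updates in the log domain for numerical stability.)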
Lay Summary: Optimal Transport is an area of mathematics that plays an important role in many applications, spanning fields such as artificial intelligence (AI), logistics, biology, physics, seismology, climate and weather prediction, and economics. Many of these applications involve computing so-called Optimal Transport distances. These distances can be thought of as how expensive it is to move something from one arrangement or location to another. They can be used to measure the similarity between objects: matching delivery trucks to houses, comparing two images or paintings, audio recordings, cloud or smoke patterns, weather patterns, biological cell processes, predictions made by AI algorithms, and much more. However, computing these distances is often very expensive, even for computers. We propose a new machine learning method which can predict these distances more quickly. Importantly, we construct our method in such a way that, unlike previous methods, it can be applied broadly to different tasks and to objects of different sizes.
Link To Code: https://github.com/GregorKornhardt/UNOT
Primary Area: General Machine Learning->Unsupervised and Semi-supervised Learning
Keywords: Optimal Transport, Neural Operators, Meta Learning, Adversarial Training
Submission Number: 13626