Abstract: We present a neural framework for learning conditional optimal transport (OT) maps between probability distributions. Conditional OT maps---transformations that adapt based on auxiliary variables such as labels, time indices, or other parameters---are essential for applications ranging from generative modeling to uncertainty quantification of black-box models. However, existing conditional OT methods face significant limitations: input convex neural networks (ICNNs) impose severe architectural constraints that reduce expressivity, while simpler conditioning strategies like concatenation fail to model fundamentally different transport behaviors across conditions. Our approach introduces a conditioning mechanism capable of simultaneously processing both categorical and continuous conditioning variables, using learnable embeddings and positional encoding. At the core of our method lies a hypernetwork that generates transport layer parameters based on these inputs, creating adaptive mappings that outperform simpler conditioning methods. We showcase the framework's practical impact through applications to global sensitivity analysis, enabling efficient computation of OT-based sensitivity indices for complex black-box models. This work advances the state of the art in conditional optimal transport, enabling broader application of optimal transport principles to complex, high-dimensional domains such as generative modeling, black-box model explainability, and scientific computing.
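To make the conditioning mechanism concrete, the following is a minimal numpy sketch, not the paper's actual implementation: a categorical condition is looked up in a learnable embedding table, a continuous condition (e.g. a time index) is mapped through sinusoidal positional encoding, and a hypernetwork turns the combined conditioning vector into the parameters of a transport layer. All names, dimensions, and the affine form of the generated layer are illustrative assumptions.

```python
# Hypothetical sketch of hypernetwork-based conditioning (not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

D, H, N_CLASSES, EMB = 2, 16, 3, 8  # data dim, hidden size, #labels, embed dim

def positional_encoding(t, dim=EMB):
    """Sinusoidal encoding of a continuous condition (e.g. a time index)."""
    freqs = 10.0 ** (-np.arange(dim // 2) / (dim // 2))
    return np.concatenate([np.sin(t * freqs), np.cos(t * freqs)])

# "Learnable" parameters, randomly initialised here for illustration.
label_emb = rng.normal(size=(N_CLASSES, EMB))                 # categorical embeddings
W_hyper = rng.normal(size=(2 * EMB, H * (D * D + D))) * 0.1   # hypernetwork weights

def transport_map(x, label, t):
    """Apply a condition-dependent affine transport layer to x."""
    cond = np.concatenate([label_emb[label], positional_encoding(t)])
    params = np.tanh(cond @ W_hyper)          # hypernetwork generates parameters
    # Reshape the generated parameters into H candidate affine maps and average
    # them (a stand-in for a deeper generated transport network).
    params = params.reshape(H, D * D + D)
    A = params[:, : D * D].mean(axis=0).reshape(D, D)
    b = params[:, D * D :].mean(axis=0)
    return A @ x + b

x = np.array([0.5, -1.0])
y0 = transport_map(x, label=0, t=0.1)
y1 = transport_map(x, label=2, t=0.9)  # different condition -> different map
```

Because the layer parameters themselves are generated from the condition, the map for one label/time can differ structurally from the map for another, rather than merely shifting a shared map as concatenation-based conditioning does.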
Submission Type: Long submission (more than 12 pages of main content)
Assigned Action Editor: ~Emmanuel_Bengio1
Submission Number: 5962