One-step Optimal Transport via Regularized Distribution Matching Distillation

19 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: optimal transport, unpaired translation, image-to-image translation, diffusion distillation
TL;DR: We propose RDMD, a one-step optimal transport method that combines transport cost minimization with distribution matching and achieves a comparable or superior faithfulness-realism trade-off relative to the baselines.
Abstract: Unpaired domain translation remains a challenging task due to the need to balance faithfulness and realism. Diffusion-based methods for unpaired translation typically excel at realism, but require numerous inference steps and tend to offer suboptimal input-output alignment. Many optimal transport (OT) based methods, on the other hand, offer efficient few-step inference and achieve superior input-output alignment, but rely heavily on adversarial training and inherit its shortcomings. In this paper, we propose a method called Regularized Distribution Matching Distillation (RDMD), which combines the best of both worlds. It replaces adversarial training with diffusion-based distribution matching, addressing the typical shortcomings of OT methods and providing a strong initialization for the trained models. RDMD retains the advantages of OT methods by providing one-step inference and explicitly controlling input-output faithfulness via regularization of the transport cost. We prove that, in theory, RDMD approximates the OT map, and demonstrate its empirical performance on several tasks, including unpaired image-to-image translation in pixel and latent space and unpaired text detoxification. Empirical results show that RDMD achieves a comparable or better faithfulness-realism trade-off than the diffusion and OT-based baselines.
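The abstract describes combining a distribution-matching term with a transport-cost regularizer trained on a one-step generator. A plausible reading of this objective, written here as a reader's sketch rather than the paper's stated formulation (the symbols $G_\theta$, $\lambda$, the cost $c$, and the KL-style matching term are assumptions based on standard DMD and OT formulations), is:

```latex
\min_{\theta} \;
\underbrace{D_{\mathrm{KL}}\!\left( p_{G_\theta} \,\middle\|\, p_{\mathrm{target}} \right)}_{\text{distribution matching (realism)}}
\;+\;
\lambda \, \underbrace{\mathbb{E}_{x \sim p_{\mathrm{source}}}\!\left[ c\!\left( x, G_\theta(x) \right) \right]}_{\text{transport cost (faithfulness)}}
```

Under this reading, the coefficient $\lambda$ would provide the explicit faithfulness-realism control the abstract claims: small $\lambda$ recovers pure distribution matching, while larger $\lambda$ enforces tighter input-output alignment.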
Primary Area: generative models
Submission Number: 21094