Local and Unbalanced Optimal Transport for Feature Learning with Probabilistic Guarantees

17 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: optimal transport, feature learning, generalization bound
Abstract: This paper explores local and unbalanced optimal transport for feature learning in an embedding space. Instead of using joint distributions of the data, we introduce conditional distributions, regularized by the Kullback-Leibler (KL) divergence with respect to reference conditional distributions. Using conditional distributions provides flexibility in controlling the transport range of individual data points. When the block coordinate descent method is employed to solve our model, the conditional and marginal distributions admit closed-form solutions. Moreover, the use of conditional distributions facilitates the derivation of a generalization bound for our model via Rademacher complexity, which characterizes its convergence rate with respect to the number of samples. By optimizing the anchors (centroids) defined in the model, we further combine unbalanced optimal transport with autoencoders to learn an embedding space of samples for clustering. Experiments demonstrate that the proposed model achieves promising performance on several learning tasks. In addition, we construct a local and unbalanced optimal transport classifier for set-valued objects.
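The abstract does not spell out the authors' algorithm, so the following is only a minimal sketch of the generic unbalanced optimal transport ingredient: entropy-regularized transport between samples and anchors with KL-relaxed marginals, solved by standard Sinkhorn-style scaling iterations. The function name, parameters (`eps`, `tau`), and toy data are hypothetical and are not taken from the paper, which instead works with conditional distributions and block coordinate descent.

```python
import numpy as np

def unbalanced_sinkhorn(C, a, b, eps=0.05, tau=1.0, n_iter=500):
    """Entropy-regularized unbalanced OT with KL-relaxed marginals.

    C   : (n, m) cost matrix between samples and anchors
    a,b : source / target weights (need not sum to one or to each other)
    eps : entropic regularization strength
    tau : weight of the KL penalties on the marginals
    Returns the (n, m) transport plan.
    """
    K = np.exp(-C / eps)              # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    rho = tau / (tau + eps)           # damping exponent from the KL relaxation
    for _ in range(n_iter):
        u = (a / (K @ v)) ** rho      # scaling update for the source marginal
        v = (b / (K.T @ u)) ** rho    # scaling update for the target marginal
    return u[:, None] * K * v[None, :]

# Toy usage: points in an embedding space transported to a few anchors (centroids).
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 2))                           # samples
Z = rng.normal(size=(3, 2))                           # anchors
C = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)    # squared Euclidean cost
P = unbalanced_sinkhorn(C, np.full(8, 1 / 8), np.full(3, 1 / 3))
print(P.shape, P.sum())               # plan mass is only approximately 1 (unbalanced)
```

Because the marginal constraints are relaxed, the total transported mass need not equal one, which is what lets such formulations limit how far individual points are moved.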
Supplementary Material: zip
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 8818