Learning representations that are closed-form Monge mapping optimal with application to domain adaptation

Published: 01 Aug 2023, Last Modified: 01 Aug 2023, Accepted by TMLR
Abstract: Optimal transport (OT) is a powerful geometric tool used to compare and align probability measures following the least-effort principle. Despite its widespread use in machine learning (ML), the OT problem remains computationally demanding and suffers from the curse of dimensionality for measures supported on general high-dimensional spaces. In this paper, we propose to tackle these challenges using representation learning. In particular, we seek to learn an embedding space in which the samples of the two input measures can be aligned by a simple affine mapping computed efficiently in closed form. We then show that this approach yields results comparable to solving the original OT problem when applied to the transfer learning task on which many OT baselines were previously evaluated, in both homogeneous and heterogeneous domain adaptation (DA) settings.
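For intuition, the closed-form affine mapping referred to in the abstract corresponds to the Monge map between Gaussian approximations of the two embedded sample sets. The sketch below is a minimal illustration of that formula, not the authors' released code; the function name, the regularization parameter `reg`, and the synthetic data are assumptions for the example.

```python
# Minimal sketch: closed-form affine Monge map between Gaussian approximations
# N(m_s, S_s) and N(m_t, S_t) fitted to source and target embeddings.
import numpy as np
from scipy.linalg import sqrtm

def closed_form_monge_map(Xs, Xt, reg=1e-6):
    """Return T with T(x) = m_t + A (x - m_s), the Gaussian Monge map."""
    m_s, m_t = Xs.mean(axis=0), Xt.mean(axis=0)
    d = Xs.shape[1]
    # Regularized empirical covariances (reg keeps them well conditioned).
    S_s = np.cov(Xs, rowvar=False) + reg * np.eye(d)
    S_t = np.cov(Xt, rowvar=False) + reg * np.eye(d)
    # A = S_s^{-1/2} (S_s^{1/2} S_t S_s^{1/2})^{1/2} S_s^{-1/2}
    S_s_half = sqrtm(S_s).real
    S_s_half_inv = np.linalg.inv(S_s_half)
    middle = sqrtm(S_s_half @ S_t @ S_s_half).real
    A = S_s_half_inv @ middle @ S_s_half_inv
    return lambda X: m_t + (X - m_s) @ A.T

# Usage on synthetic data: map source samples onto the target distribution.
rng = np.random.default_rng(0)
Xs = rng.normal(size=(500, 8))
Xt = 2.0 * rng.normal(size=(500, 8)) + 1.0
Xs_mapped = closed_form_monge_map(Xs, Xt)(Xs)
```

Because this map only requires means, covariances, and matrix square roots, it avoids solving a full OT problem between the two sample sets.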
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: As requested by the reviewers, we made the following changes: 1. The title was changed to put more emphasis on the OT contribution rather than the DA one. 2. The abstract and contributions sections were rewritten to reflect the new positioning. 3. The preliminary knowledge and proposed contributions are now centered more on the OT part. 4. Added a comparison on the large-scale VisDA-2017 dataset with JUMBOT (Fatras et al., ICML'21). 5. Added a sentence on how shrinkage can be used to obtain better covariance estimates for small batch sizes.
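Regarding point 5, a standard way to shrink a covariance estimate computed from a small batch is a convex combination with a scaled identity target (Ledoit-Wolf chooses the intensity automatically). The snippet below is an assumed, generic sketch of this idea and does not reproduce the paper's exact shrinkage rule.

```python
# Sketch of covariance shrinkage for small batches (illustrative only).
import numpy as np
from sklearn.covariance import LedoitWolf

def shrunk_covariance(X, alpha=None):
    """Shrinkage estimate of cov(X), with X of shape (n_samples, d)."""
    if alpha is None:
        # Ledoit-Wolf selects the shrinkage intensity from the data.
        return LedoitWolf().fit(X).covariance_
    S = np.cov(X, rowvar=False)
    d = S.shape[0]
    # Convex combination of the empirical covariance and a scaled identity.
    return (1 - alpha) * S + alpha * (np.trace(S) / d) * np.eye(d)
```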
Code: https://github.com/Oleffa/LaOT
Supplementary Material: zip
Assigned Action Editor: ~Hanie_Sedghi1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 1138