Computing all Optimal Partial Transports

Published: 01 Feb 2023, Last Modified: 14 Apr 2023
ICLR 2023 poster
Readers: Everyone
Keywords: Optimal Transport, Combinatorial Optimization
Abstract: We consider the classical version of the optimal partial transport problem. Let $\mu$ (with a mass of $U$) and $\nu$ (with a mass of $S$) be two discrete mass distributions with $S \le U$ and let $n$ be the total number of points in the supports of $\mu$ and $\nu$. For a parameter $\alpha \in [0,S]$, consider the minimum-cost transport plan $\sigma_\alpha$ that transports a mass of $\alpha$ from $\nu$ to $\mu$. An \emph{OT-profile} captures the behavior of the cost of $\sigma_\alpha$ as $\alpha$ varies from $0$ to $S$. There is only limited work on OT-profile and its mathematical properties (see~\cite{figalli2010optimal}). In this paper, we present a novel framework to analyze the properties of the OT-profile and also present an algorithm to compute it. When $\mu$ and $\nu$ are discrete mass distributions, we show that the OT-profile is a piecewise-linear non-decreasing convex function. Let $K$ be the combinatorial complexity of this function, i.e., the number of line segments required to represent the OT-profile. Our exact algorithm computes the OT-profile in $\tilde{O}(n^2K)$ time. Given $\delta > 0$, we also show that the algorithm by ~\cite{lahn2019graph} can be used to $\delta$-approximate the OT-profile in $O(n^2/\delta + n/\delta^2)$ time. This approximation is a piecewise-linear function of a combinatorial complexity of $O(1/\delta)$. An OT-profile is arguably more valuable than the OT-cost itself and can be used within applications. Under a reasonable assumption of outliers, we also show that the first derivative of the OT-profile sees a noticeable rise before any of the mass from outliers is transported. By using this property, we get an improved prediction accuracy for an outlier detection experiment. We also use this property to predict labels and estimate the class priors within PU-Learning experiments. Both these experiments are conducted on real datasets.
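The piecewise-linear convexity of the OT-profile claimed in the abstract can be checked on a toy instance. The sketch below is illustrative only and is not the paper's $\tilde{O}(n^2K)$ algorithm: for unit-mass points, transporting an integer mass $k$ reduces to a minimum-cost matching of $k$ source-sink pairs, which we enumerate by brute force. The cost matrix is a made-up example.

```python
# Hedged sketch: brute-force OT-profile for tiny unit-mass distributions.
# For unit masses, transporting integer mass k equals a min-cost matching
# of k source-sink pairs, so we enumerate all partial matchings.
# The cost matrix below is illustrative, not taken from the paper.
from itertools import combinations, permutations

def partial_ot_cost(cost, k):
    """Minimum cost of transporting mass k between unit-mass supports
    (rows = points of nu, columns = points of mu in the cost matrix)."""
    n_src, n_dst = len(cost), len(cost[0])
    best = float("inf")
    for srcs in combinations(range(n_src), k):
        for dsts in permutations(range(n_dst), k):
            best = min(best, sum(cost[s][d] for s, d in zip(srcs, dsts)))
    return best

def ot_profile(cost):
    """OT-profile sampled at integer masses alpha = 0..S,
    where S is the number of unit-mass points of nu."""
    return [partial_ot_cost(cost, k) for k in range(len(cost) + 1)]

cost = [[1.0, 4.0, 6.0],
        [2.0, 1.0, 5.0],
        [7.0, 3.0, 2.0]]
profile = ot_profile(cost)
# Convexity of the profile: marginal costs are non-decreasing.
marginals = [b - a for a, b in zip(profile, profile[1:])]
assert all(x <= y for x, y in zip(marginals, marginals[1:]))
print(profile)  # -> [0, 1.0, 2.0, 4.0]
```

The non-decreasing marginal costs are exactly the convexity property the paper proves; the breakpoints of the full (continuous) profile are where its derivative jumps, and their number is the combinatorial complexity $K$.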
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Optimization (eg, convex and non-convex optimization)
Supplementary Material: zip