Averaging on the Bures-Wasserstein manifold: dimension-free convergence of gradient descent

21 May 2021, 20:48 (edited 21 Jan 2022) · NeurIPS 2021 Spotlight · Readers: Everyone
  • Keywords: Bures-Wasserstein barycenter, dimension-free convergence, entropic regularization, first-order optimization, geometric median, non-convex optimization, Riemannian optimization
  • TL;DR: We improve state-of-the-art convergence guarantees for Riemannian gradient descent for computing geometric averages of Gaussians.
  • Abstract: We study first-order optimization algorithms for computing the barycenter of Gaussian distributions with respect to the optimal transport metric. Although the objective is geodesically non-convex, Riemannian gradient descent empirically converges rapidly, in fact faster than off-the-shelf methods such as Euclidean gradient descent and SDP solvers. This stands in stark contrast to the best-known theoretical results, which depend exponentially on the dimension. In this work, we prove new geodesic convexity results which provide stronger control of the iterates, yielding a dimension-free convergence rate. Our techniques also enable the analysis of two related notions of averaging, the entropically-regularized barycenter and the geometric median, providing the first convergence guarantees for these problems.
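The Riemannian gradient descent the abstract refers to admits a compact implementation: with unit step size it reduces to the classical fixed-point iteration for the Bures-Wasserstein barycenter, which averages the optimal-transport maps from the current iterate to each input covariance. The sketch below is illustrative only (function and variable names are our own, not from the paper's code); it assumes centered Gaussians given by their covariance matrices.

```python
import numpy as np
from scipy.linalg import sqrtm

def bw_barycenter(covs, weights=None, steps=100, tol=1e-10):
    """Fixed-point iteration for the Bures-Wasserstein barycenter of
    centered Gaussians N(0, covs[i]); equivalent to Riemannian gradient
    descent with unit step size. Illustrative sketch, not the paper's code.
    """
    n = len(covs)
    d = covs[0].shape[0]
    w = np.full(n, 1.0 / n) if weights is None else np.asarray(weights)
    S = np.eye(d)  # initial iterate: identity covariance
    for _ in range(steps):
        root = np.real(sqrtm(S))
        root_inv = np.linalg.inv(root)
        # weighted sum of (S^{1/2} C_i S^{1/2})^{1/2} over the inputs
        M = sum(wi * np.real(sqrtm(root @ C @ root))
                for wi, C in zip(w, covs))
        # update: S <- S^{-1/2} M^2 S^{-1/2}
        S_new = root_inv @ M @ M @ root_inv
        if np.linalg.norm(S_new - S) < tol:
            S = S_new
            break
        S = S_new
    return (S + S.T) / 2  # symmetrize against round-off
```

For commuting (e.g. diagonal) covariances the barycenter can be checked in closed form: its square root is the weighted average of the inputs' square roots, a convenient sanity test for the iteration.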
  • Supplementary Material: pdf
  • Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
  • Code: https://github.com/PatrikGerber/Bures-Barycenters
