On Riemannian Optimization over Positive Definite Matrices with the Bures-Wasserstein Geometry

Published: 09 Nov 2021, Last Modified: 14 Jul 2024 · NeurIPS 2021 Poster · Readers: Everyone
Keywords: Riemannian optimization, Riemannian manifold, Symmetric Positive Definite, Bures-Wasserstein, Affine-Invariant, geodesic convexity, non-negative curvature
Abstract: In this paper, we comparatively analyze the Bures-Wasserstein (BW) geometry and the popular Affine-Invariant (AI) geometry for Riemannian optimization on the symmetric positive definite (SPD) matrix manifold. Our study begins with the observation that the BW metric has a linear dependence on SPD matrices, in contrast to the quadratic dependence of the AI metric. We build on this to show that the BW metric is a more suitable and robust choice for several Riemannian optimization problems over ill-conditioned SPD matrices. We show that the BW geometry has non-negative curvature, which further improves the convergence rates of algorithms compared to the non-positively curved AI geometry. Finally, we verify that several popular cost functions, which are known to be geodesically convex under the AI geometry, are also geodesically convex under the BW geometry. Extensive experiments on various applications support our findings.
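The linear-versus-quadratic dependence is visible directly in the metric formulas: the AI inner product at an SPD matrix X is tr(X⁻¹ U X⁻¹ V), involving X⁻¹ twice, while the BW inner product is (1/2) tr(L_X[U] V), where L_X[U] solves the Lyapunov equation X L + L X = U and thus depends on X only linearly. Below is a minimal sketch in Python/NumPy illustrating both inner products; the function names `ai_inner` and `bw_inner` are illustrative and not taken from the paper's code repository.

```python
# Sketch: comparing the Affine-Invariant (AI) and Bures-Wasserstein (BW)
# Riemannian inner products at an SPD matrix X. Tangent vectors U, V are
# symmetric matrices. This is an illustration, not the paper's implementation.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov


def ai_inner(X, U, V):
    """AI metric: <U, V>_X = tr(X^{-1} U X^{-1} V). Quadratic dependence on X^{-1}."""
    Xinv = np.linalg.inv(X)
    return np.trace(Xinv @ U @ Xinv @ V)


def bw_inner(X, U, V):
    """BW metric: <U, V>_X = (1/2) tr(L_X[U] V), where X L + L X = U.

    The Lyapunov solve means the metric depends on X only linearly.
    """
    L = solve_continuous_lyapunov(X, U)  # solves X L + L X = U
    return 0.5 * np.trace(L @ V)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 5
    A = rng.standard_normal((n, n))
    X = A @ A.T + n * np.eye(n)  # an SPD matrix
    U = rng.standard_normal((n, n)); U = U + U.T  # symmetric tangent vectors
    V = rng.standard_normal((n, n)); V = V + V.T
    print("AI inner product:", ai_inner(X, U, V))
    print("BW inner product:", bw_inner(X, U, V))
```

For an ill-conditioned X, the explicit inversion in `ai_inner` amplifies numerical error, whereas `bw_inner` only requires a Lyapunov solve, which is consistent with the paper's argument that the BW geometry is the more robust choice in that regime.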
Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
TL;DR: An analysis comparing different Riemannian metrics for optimizing cost functions defined on the manifold of symmetric positive definite matrices.
Code: https://github.com/andyjm3/AI-vs-BW
Community Implementations: 1 code implementation (https://www.catalyzex.com/paper/on-riemannian-optimization-over-positive/code)