Keywords: min-max optimization, Riemannian optimization, robust manifold optimization, robust PCA, geodesically convex-concave functions
TL;DR: We prove last-iterate convergence rates for Riemannian corrected extragradient and Riemannian gradient descent-ascent on geodesically convex-concave min-max problems over manifolds.
Abstract: From optimal transport to robust dimensionality reduction, many machine learning applications
can be cast as min-max optimization problems over Riemannian manifolds. Though many min-max
algorithms have been analyzed in the Euclidean setting, it has remained elusive how these results
translate to the Riemannian case. Zhang et al. (2022) recently identified that geodesically
convex-concave Riemannian problems always admit saddle-point solutions in the sense of Sion's theorem.
An important question that immediately arises is whether a performance gap between Riemannian
algorithms and their optimal Euclidean convex-concave counterparts is necessary. Our work is the first
to answer this question in the negative: we prove that the Riemannian corrected extragradient (RCEG)
method achieves last-iterate convergence at a linear rate in the geodesically strongly convex-concave
case, matching the Euclidean rate. Our results also extend to the stochastic and non-smooth cases,
where RCEG and Riemannian gradient descent-ascent (RGDA) achieve near-optimal convergence rates,
up to factors depending on the curvature of the manifold. Finally, we empirically demonstrate the
effectiveness of RCEG in solving robust PCA.
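The RCEG update alternates an extrapolation step with a "corrected" step that re-centers at the previous iterate via the inverse exponential (log) map. The following sketch is not the paper's code: it specializes RCEG to flat Euclidean space, where the exponential map is vector addition and the log map is vector subtraction, and applies it to the toy bilinear saddle problem f(x, y) = xy, whose unique saddle point is (0, 0).

```python
import numpy as np

def exp_map(x, v):
    # Euclidean exponential map: move from x along tangent vector v.
    return x + v

def log_map(x, y):
    # Euclidean inverse exponential map: tangent vector at x pointing to y.
    return y - x

def vector_field(z):
    # F(z) = (grad_x f, -grad_y f) for the bilinear problem f(x, y) = x * y.
    x, y = z
    return np.array([y, -x])

def rceg(z0, eta=0.3, steps=300):
    """Riemannian corrected extragradient, Euclidean specialization."""
    z = np.asarray(z0, dtype=float)
    for _ in range(steps):
        # Extrapolation step: tentative move along the current field.
        z_half = exp_map(z, -eta * vector_field(z))
        # Corrected step: from z_half, combine the pull-back toward z
        # with the field evaluated at the extrapolated point.
        z = exp_map(z_half, log_map(z_half, z) - eta * vector_field(z_half))
    return z

z_final = rceg([1.0, 1.0])
```

In the Euclidean case the correction term cancels and the scheme reduces to the classical extragradient method, which converges linearly on this bilinear problem; on a curved manifold the exp/log maps differ and the correction matters.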
Supplementary Material: pdf