Metric-Projected Accelerated Riemannian Optimization: Handling Constraints to Bound Geometric Penalties

16 May 2022 (modified: 05 May 2023) | NeurIPS 2022 Submitted | Readers: Everyone
Keywords: Riemannian optimization, acceleration, constrained, proximal methods
TL;DR: We propose accelerated first-order methods for Riemannian optimization that use a metric-projection oracle, avoiding the undesirable assumptions required by previous accelerated works.
Abstract: We propose an accelerated first-order method for the optimization of smooth and (strongly or not) geodesically convex functions over a compact and geodesically convex set in Hadamard manifolds, which we access via a metric-projection oracle. It enjoys the same rates of convergence as Nesterov's accelerated gradient descent, up to a multiplicative geometric penalty and log factors. Even in the absence of in-manifold constraints, all prior fully accelerated methods require their iterates to remain in some specified compact set (a condition needed in worst-case analyses due to a lower bound), yet only two previous methods are able to enforce this condition, and those have limited applicability, for instance to local optimization or to spaces of constant curvature. Our results solve an open question in (Kim and Yang, 2022) and another question related to one posed in (Zhang and Sra, 2016). Along the way, we show that projected Riemannian gradient descent can implement an inexact proximal point operator, which we use as a subroutine and which is of independent interest.
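
To illustrate the subroutine the abstract mentions, the following Python sketch shows projected Riemannian gradient descent used to approximate a proximal point. This is not the paper's algorithm or its rates; the manifold interface (exp, log, project) and all step sizes and iteration counts are hypothetical placeholders, and the sanity check uses the zero-curvature (Euclidean) case where the maps are trivial.

def projected_rgd(grad_f, exp, project, x0, step, n_iters):
    """Projected Riemannian gradient descent: take a gradient step via the
    exponential map, then metric-project onto the geodesically convex set."""
    x = x0
    for _ in range(n_iters):
        x = project(exp(x, -step * grad_f(x)))
    return x

def inexact_prox(grad_f, exp, log, project, y, lam, step, n_iters):
    """Approximate prox_{lam*f}(y) = argmin_x f(x) + d(x, y)^2 / (2*lam)
    by running projected RGD on the regularized objective. The Riemannian
    gradient of x -> d(x, y)^2 / 2 is -log_x(y)."""
    def prox_grad(x):
        return grad_f(x) - log(x, y) / lam
    return projected_rgd(prox_grad, exp, project, y, step, n_iters)

if __name__ == "__main__":
    # Euclidean sanity check (a Hadamard manifold of zero curvature):
    # exp/log reduce to vector addition/subtraction; the constraint set
    # is the interval [-1, 1], so metric projection is clipping.
    exp = lambda x, v: x + v
    log = lambda x, y: y - x
    project = lambda x: max(-1.0, min(1.0, x))
    grad_f = lambda x: 2.0 * (x - 3.0)  # f(x) = (x - 3)^2, minimizer outside the set
    print(projected_rgd(grad_f, exp, project, x0=0.0, step=0.1, n_iters=200))      # ~1.0
    print(inexact_prox(grad_f, exp, log, project, y=0.0, lam=1.0, step=0.1, n_iters=200))  # ~1.0

In the second call, the regularized objective (x - 3)^2 + x^2/2 has its unconstrained minimizer at x = 2, so the projected iterates settle at the boundary point 1.0, matching the closed-form metric projection.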
Supplementary Material: pdf
