Keywords: Riemannian optimization, Acceleration, Wasserstein space
TL;DR: We propose a new class of vector-transport-based gradient descent methods that enable silver stepsize acceleration on Riemannian manifolds, yielding provably accelerated gradient methods for potential functional optimization in Wasserstein space.
Abstract: There is extensive literature on accelerating first-order optimization methods in a Euclidean setting. The conditions under which such acceleration is feasible for Riemannian optimization problems are an active area of research. Motivated by the recent success of silver stepsize methods in the Euclidean setting, we undertake a study of such algorithms in the Riemannian setting. We introduce a new class of algorithms, parameterized by the choice of vector transport, that achieves silver stepsize acceleration on Riemannian manifolds for the function classes associated with the corresponding vector transport. As a core application, we show that our algorithm recovers standard Wasserstein gradient descent on the 2-Wasserstein space and thereby provides the first provable accelerated gradient method for potential functional optimization problems in Wasserstein space. In addition, we validate the numerical strength of the algorithm on standard benchmark tasks on the space of symmetric positive definite matrices.
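To make the Euclidean starting point concrete, below is a minimal sketch of gradient descent with the silver stepsize schedule of Altschuler and Parrilo, which the abstract cites as motivation. The recursive construction (stepsizes relative to 1/L, built around the silver ratio ρ = 1 + √2) follows one common presentation of that schedule; the function names `silver_schedule` and `gd_silver` are illustrative, not from the paper.

```python
import numpy as np

RHO = 1 + np.sqrt(2)  # silver ratio

def silver_schedule(k):
    """Silver stepsize schedule of length 2**k - 1 (units of 1/L).

    Built recursively: h_1 = [sqrt(2)],
    h_{i+1} = h_i + [1 + RHO**(i-1)] + h_i  (a sketch, per
    Altschuler-Parrilo's recursive doubling construction).
    """
    h = [np.sqrt(2)]
    for i in range(1, k):
        h = h + [1 + RHO ** (i - 1)] + h
    return h

def gd_silver(grad, x0, L, k):
    """Gradient descent on an L-smooth convex function using the
    silver stepsizes h_t / L (illustrative Euclidean version)."""
    x = np.asarray(x0, dtype=float)
    for h in silver_schedule(k):
        x = x - (h / L) * grad(x)
    return x
```

Note that individual stepsizes exceed the classical 2/L stability threshold (the middle entry grows like ρ^{k-1}), so single steps can temporarily increase the objective; the acceleration guarantee holds only over the full schedule. The Riemannian variant in the paper replaces the Euclidean update with a retraction/vector-transport step.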
Supplementary Material: zip
Primary Area: Optimization (e.g., convex and non-convex, stochastic, robust)
Submission Number: 17269