Abstract: We introduce a Riemannian optimistic online learning algorithm for Hadamard manifolds based on inexact implicit updates. Unlike prior work, our method can handle in-manifold constraints and matches the best known regret bounds from the Euclidean setting, with no dependence on geometric constants such as the minimum curvature. Building on this, we develop algorithms for smooth, g-convex-g-concave min-max problems on Hadamard manifolds. Notably, one of our methods nearly matches, for the first time, the gradient oracle complexity of the lower bound for Euclidean problems.
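As background (a minimal sketch only, not the paper's exact update rule), the classical implicit, or proximal-point, online step on a Hadamard manifold replaces the Euclidean squared distance with the squared geodesic distance. Here the constraint set $K$, step size $\eta > 0$, round-$t$ loss $f_t$, and geodesic distance $d$ are illustrative notation not taken from the abstract:

\[
x_{t+1} \in \operatorname*{arg\,min}_{x \in K} \; \Big\{ \eta\, f_t(x) + \tfrac{1}{2}\, d(x, x_t)^2 \Big\}.
\]

An optimistic variant would additionally exploit a hint about the next loss, and "inexact" refers to solving this subproblem only up to a controlled error rather than exactly.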
Lay Summary: Many problems, like training advanced machine learning models or analyzing complex data, require making decisions in spaces that are unlike our own. Imagine, for instance, a smooth surface shaped like a Pringle chip, stretching out endlessly in all directions (and perhaps in a million dimensions).
These problems may also require making decisions within a specific region, like being confined to a marked area on the chip. Our research introduces new algorithms for learning and making decisions efficiently on such curved spaces, even when the decisions must obey extra rules about where they can go. Our approach overcomes the challenges caused by these restrictions and is as fast as the fastest methods for spaces that are not curved.
This progress allows machine learning systems to handle a wider variety of real-world challenges, especially those where new information keeps arriving over time, or where it's unclear what future situations might look like.
Primary Area: Theory->Online Learning and Bandits
Keywords: Riemannian Optimization, Optimistic algorithms, Online Optimization, Min-Max optimization
Submission Number: 5017