Accelerated Single-Call Methods for Constrained Min-Max Optimization

Published: 23 Nov 2022, Last Modified: 26 Mar 2024, OPT 2022 Poster
Keywords: min-max optimization, nonconvex-nonconcave, variational inequalities, saddle point problem, first-order method
TL;DR: We propose the first single-call single-projection algorithms with optimal convergence rate for constrained min-max optimization problems in the nonconvex-nonconcave setting.
Abstract: We study first-order methods for constrained min-max optimization. Existing methods require either two gradient calls or two projections per iteration, which may be costly in applications. In this paper, we first show that the \emph{Optimistic Gradient (OG)} method, a \emph{single-call single-projection} algorithm, has an $O(\frac{1}{\sqrt{T}})$ convergence rate for inclusion problems with operators that satisfy the weak Minty variational inequality (MVI). Our second result is the \emph{Accelerated Reflected Gradient (ARG)} method, the first single-call single-projection algorithm that achieves the \emph{optimal $O(\frac{1}{T})$} convergence rate for inclusion problems that satisfy negative comonotonicity. Both the weak MVI and negative comonotonicity are well-studied assumptions that capture a rich class of nonconvex-nonconcave min-max optimization problems. Finally, we show that the \emph{Reflected Gradient (RG)} method, another \emph{single-call single-projection} algorithm, has an $O(\frac{1}{\sqrt{T}})$ last-iterate convergence rate for constrained convex-concave min-max optimization, answering an open question of Hsieh et al. (2019).
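For context, the two assumptions are typically stated as follows (the exact constants vary across papers): an operator $F$ satisfies the weak MVI if $\langle F(z), z - z^\star \rangle \ge -\rho \|F(z)\|^2$ for some solution $z^\star$ and all feasible $z$, and $F$ is negatively comonotone if $\langle F(u) - F(v), u - v \rangle \ge -\rho \|F(u) - F(v)\|^2$ for all $u, v$; both reduce to their monotone counterparts at $\rho = 0$.

Below is a minimal sketch of the single-call single-projection templates for OG and RG as they are commonly written in the literature, applied to a toy constrained bilinear problem. The toy problem, step size, and iteration count are illustrative assumptions, not the paper's setting, and this sketch does not cover the ARG method.

```python
import numpy as np

# Toy constrained convex-concave problem: min_x max_y  x*y  over [-1, 1]^2.
# The associated operator is F(x, y) = (y, -x); the saddle point is (0, 0).
def F(z):
    x, y = z
    return np.array([y, -x])

def project(z):
    # Euclidean projection onto the box [-1, 1]^2 (clipping).
    return np.clip(z, -1.0, 1.0)

def optimistic_gradient(z0, eta=0.1, T=2000):
    """Optimistic Gradient (OG): one gradient call and one projection
    per iteration, reusing the previous iteration's gradient."""
    z, g_prev = z0.copy(), F(z0)
    for _ in range(T):
        g = F(z)                       # the single gradient call
        z = project(z - eta * (2 * g - g_prev))
        g_prev = g
    return z

def reflected_gradient(z0, eta=0.1, T=2000):
    """Reflected Gradient (RG): the single gradient call is made
    at the reflected point 2*z - z_prev."""
    z_prev, z = z0.copy(), z0.copy()
    for _ in range(T):
        g = F(2 * z - z_prev)          # the single gradient call
        z_prev, z = z, project(z - eta * g)
    return z

z0 = np.array([0.8, -0.6])
print(optimistic_gradient(z0))  # approaches the saddle point (0, 0)
print(reflected_gradient(z0))   # approaches the saddle point (0, 0)
```

Both iterations make exactly one call to $F$ and one projection per step, in contrast to extragradient-type methods, which need two of each.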
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2210.03096/code)