Last-Iterate Convergence of Optimistic Gradient Method for Monotone Variational Inequalities

Published: 31 Oct 2022, Last Modified: 12 Jan 2023, NeurIPS 2022 Accept
Keywords: optimistic gradient method, monotone variational inequalities, last-iterate convergence, computer-aided proofs
TL;DR: The first analysis of last-iterate convergence of the Optimistic Gradient method for monotone Lipschitz variational inequalities (both constrained and unconstrained) without assuming Lipschitzness of the Jacobian
Abstract: The Past Extragradient (PEG) [Popov, 1980] method, also known as the Optimistic Gradient method, has recently gained interest in the optimization community with the emergence of variational inequality formulations for machine learning. Recently, in the unconstrained case, Golowich et al. [2020] proved that an $O(1/N)$ last-iterate convergence rate in terms of the squared norm of the operator can be achieved for Lipschitz and monotone operators with a Lipschitz Jacobian. In this work, by introducing a novel analysis through potential functions, we show that (i) this $O(1/N)$ last-iterate convergence can be achieved without any assumption on the Jacobian of the operator, and (ii) it can be extended to the constrained case, which had not been derived before, even under Lipschitzness of the Jacobian. The proof is significantly different from the one of Golowich et al. [2020], and its discovery was computer-aided. These results close the open question of the last-iterate convergence of PEG for monotone variational inequalities.
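As a companion illustration (not taken from the paper): a minimal sketch of the unconstrained Optimistic Gradient / PEG update $z_{k+1} = z_k - 2\gamma F(z_k) + \gamma F(z_{k-1})$ on a toy bilinear saddle operator, which is monotone and Lipschitz. The operator, the step size $\gamma = 1/(4L)$, and the iteration count are illustrative assumptions, not choices prescribed by the paper.

```python
import numpy as np

# Monotone, Lipschitz toy operator: the saddle operator of the bilinear
# game min_x max_y x^T A y, i.e. F(x, y) = (A y, -A^T x).
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))

def F(z):
    x, y = z[:n], z[n:]
    return np.concatenate([A @ y, -A.T @ x])

# Optimistic Gradient / PEG in its unconstrained single-sequence form:
#   z_{k+1} = z_k - 2*gamma*F(z_k) + gamma*F(z_{k-1})
L = np.linalg.norm(A, 2)   # Lipschitz constant of F (spectral norm of A)
gamma = 1 / (4 * L)        # illustrative step size within the usual stability range

z = rng.standard_normal(2 * n)
F_prev = F(z)              # initialize with F(z_{-1}) = F(z_0)
for k in range(10_000):
    F_curr = F(z)
    z = z - 2 * gamma * F_curr + gamma * F_prev
    F_prev = F_curr

# The squared operator norm at the last iterate decays roughly like O(1/N).
print("||F(z_N)||^2 =", np.linalg.norm(F(z)) ** 2)
```

For this bilinear game, plain gradient descent-ascent diverges, while the optimistic correction term $\gamma F(z_{k-1})$ stabilizes the last iterate, which is the behavior the paper's rate quantifies.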
Supplementary Material: pdf