Keywords: optimization, extragradient, min-max, variational inequality problem
TL;DR: We propose a novel step size strategy for the extragradient method to solve root-finding and min-max optimization problems under relaxed Lipschitz assumptions
Abstract: Introduced by Korpelevich in 1976, the extragradient method (EG) has become a cornerstone technique for solving min-max optimization, root-finding problems, and variational inequalities (VIs). Despite its long history and the significant attention it has received within the optimization community, most works studying its convergence guarantees assume the strong $L$-Lipschitz condition. In this work, building on the assumptions proposed by Zhang et al. [2019] for minimization and Vankov et al. [2024a] for VIs, we focus on the more relaxed $\alpha$-symmetric $(L_0, L_1)$-Lipschitz condition. This condition generalizes the standard Lipschitz assumption by allowing the Lipschitz constant to scale with the operator norm, providing a more refined characterization of problem structures in modern machine learning. Under the $\alpha$-symmetric $(L_0, L_1)$-Lipschitz condition, we propose a novel step size strategy for EG to solve root-finding problems and establish sublinear convergence rates for monotone operators and linear convergence rates for strongly monotone operators. Additionally, we prove local convergence guarantees for weak Minty operators. We supplement our analysis with experiments validating our theory and demonstrating the effectiveness and robustness of the proposed step sizes for EG.
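For context, below is a minimal sketch of the classical extragradient iteration for root-finding with a constant step size; the operator `F`, step size `gamma`, and the bilinear test problem are illustrative assumptions, and the paper's adaptive step size rule under the $\alpha$-symmetric $(L_0, L_1)$-Lipschitz condition is not reproduced here.

```python
import numpy as np

def extragradient(F, x0, gamma, num_iters=1000):
    """Classical extragradient method (Korpelevich, 1976) for root-finding F(x*) = 0.

    F      : operator mapping R^n -> R^n (e.g., the saddle-gradient field of a min-max problem)
    x0     : initial point
    gamma  : constant step size (the paper proposes an adaptive strategy instead)
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        x_half = x - gamma * F(x)       # extrapolation step
        x = x - gamma * F(x_half)       # update step evaluated at the extrapolated point
    return x

# Illustrative example: bilinear min-max problem min_u max_v u*v, whose operator
# F(u, v) = (v, -u) is monotone; EG converges to the solution (0, 0), whereas
# plain gradient descent-ascent with the same step size diverges.
F = lambda z: np.array([z[1], -z[0]])
print(extragradient(F, x0=[1.0, 1.0], gamma=0.1, num_iters=500))
```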
Supplementary Material: zip
Primary Area: Optimization (e.g., convex and non-convex, stochastic, robust)
Submission Number: 13630