Generalized Smooth Stochastic Variational Inequalities: Almost Sure Convergence and Convergence Rates
Abstract: This paper focuses on solving a stochastic variational inequality (SVI) problem under a relaxed smoothness assumption for a class of structured non-monotone operators. The SVI problem has attracted significant interest in the machine learning community due to its direct applications in adversarial training and multi-agent reinforcement learning. In many such applications, the resulting operators do not satisfy the standard smoothness assumption. To address this issue, we adopt a weaker generalized smoothness assumption called $\alpha$-symmetric. Under the $\alpha$-symmetric and $p$-quasi-sharpness assumptions on the operator, we study the clipped projection (gradient descent-ascent) and clipped Korpelevich (extragradient) methods. For these clipped methods, we provide the first almost-sure convergence results without any boundedness assumptions on either the stochastic operator or the stochastic samples. We also provide the first in-expectation, unbiased convergence-rate results for these methods under the relaxed smoothness assumption for $\alpha \leq \frac{1}{2}$.
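For intuition, below is a minimal NumPy sketch of the two clipped updates the abstract refers to. It is an illustration, not the paper's method: the stochastic oracle interface `F(x, rng)`, the Euclidean projection `project`, the step size `gamma`, and the clipping level `c` are assumptions of this sketch, and the paper's actual step-size and clipping schedules may differ.

```python
import numpy as np

def clip(g, c):
    # Rescale g so that its Euclidean norm is at most c.
    n = np.linalg.norm(g)
    return g if n <= c else (c / n) * g

def clipped_projection(F, project, x0, gamma, c, num_iters, rng):
    # Clipped projection (gradient descent-ascent) method:
    # one projected step per iteration using a clipped stochastic sample.
    x = x0
    for _ in range(num_iters):
        x = project(x - gamma * clip(F(x, rng), c))
    return x

def clipped_korpelevich(F, project, x0, gamma, c, num_iters, rng):
    # Clipped Korpelevich (extragradient) method: an extrapolation step
    # followed by an update step, each using a fresh clipped sample.
    x = x0
    for _ in range(num_iters):
        y = project(x - gamma * clip(F(x, rng), c))
        x = project(x - gamma * clip(F(y, rng), c))
    return x

# Toy usage (hypothetical): a noisy bilinear operator F(z) = M z + noise
# on R^2 with no constraints, so the projection is the identity.
M = np.array([[0.0, 1.0], [-1.0, 0.0]])
rng = np.random.default_rng(0)
F = lambda z, r: M @ z + 0.1 * r.standard_normal(2)
z_last = clipped_korpelevich(F, lambda z: z, np.array([1.0, 1.0]),
                             gamma=0.1, c=1.0, num_iters=5000, rng=rng)
```

The design point the sketch captures is that clipping bounds each update even when the operator's growth is only controlled by the generalized ($\alpha$-symmetric) smoothness condition, which is what lets the analysis avoid boundedness assumptions on the stochastic operator or samples.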
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: We revised the paper, correcting errors and updating the proofs of the almost-sure convergence results. We added further discussion of the convergence guarantees, rates, and method comparisons, and expanded the numerical experiments. We also corrected typos and incorporated the changes requested by the reviewers.
Supplementary Material: zip
Assigned Action Editor: ~Yunwen_Lei1
Submission Number: 5308