Keywords: Variational inequalities, Convex optimization, Variance reduction
Abstract: Variational inequalities (VIs) have emerged as a universal framework for formulating a wide range of problems, with applications spanning optimization, equilibrium analysis, reinforcement learning, and the rapidly evolving field of generative adversarial networks (GANs). Stochastic methods have proven to be powerful tools for addressing such problems, but they often suffer from irreducible variance, necessitating the development of variance reduction techniques. Among these, SARAH-based algorithms have demonstrated remarkable practical effectiveness. In this work, we propose a new stochastic variance-reduced algorithm for solving stochastic variational inequalities.
We push the boundaries of existing methodologies by leveraging the PAGE method to solve VIs. Unlike prior studies, which lacked theoretical guarantees under general assumptions, we establish rigorous convergence rates, closing a crucial gap in the literature. Our contributions advance both the theory and the practice of solving variational inequalities.
To substantiate our claims, we conduct extensive experiments across diverse benchmarks, including the widely studied denoising task. The results consistently demonstrate the superior efficiency of our approach, underscoring its potential for real-world applications.
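For readers unfamiliar with the estimator the abstract refers to, the following is a minimal sketch of how a PAGE-style probabilistic estimator can be combined with a simple forward step to solve a stochastic VI. This is our illustration under stated assumptions, not the submission's algorithm: the regularized bilinear operator, the refresh probability `p`, the batch size, and the step size are all illustrative choices.

```python
# Minimal sketch (illustrative, not the paper's method) of a PAGE-style
# estimator for a stochastic variational inequality: find z with F(z) = 0,
# where F is the operator of the regularized bilinear saddle problem
#   min_x max_y (mu/2)||x||^2 + x^T A y - (mu/2)||y||^2,  A = mean_i A_i.
import numpy as np

rng = np.random.default_rng(0)
n, d, mu = 64, 10, 0.5                             # all parameters assumed
A = rng.normal(size=(n, d, d)) / np.sqrt(d)        # operator components A_i

def F(z, idx):
    """Strongly monotone operator F(x, y) = (mu*x + A y, mu*y - A^T x),
    with A averaged over the component indices `idx`."""
    x, y = z[:d], z[d:]
    Am = A[idx].mean(axis=0)
    return np.concatenate([mu * x + Am @ y, mu * y - Am.T @ x])

full = np.arange(n)
z = rng.normal(size=2 * d)                         # z = (x, y)
g = F(z, full)                                     # start from the full operator
p, batch, step = 0.2, 8, 0.2

for t in range(1000):
    z_prev, z = z, z - step * g                    # forward step using estimate g
    if rng.random() < p:                           # with probability p: full refresh
        g = F(z, full)
    else:                                          # otherwise: cheap PAGE correction
        idx = rng.choice(n, size=batch, replace=False)
        g = g + F(z, idx) - F(z_prev, idx)

print("||F(z)|| at the final iterate:", np.linalg.norm(F(z, full)))
```

The key property the sketch exercises is that the minibatch correction `F(z, idx) - F(z_prev, idx)` shrinks as consecutive iterates approach each other, so the estimator's variance vanishes near the solution while full-operator evaluations are needed only with probability `p`.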
Latex Source Code: zip
Signed PMLR Licence Agreement: pdf
Readers: auai.org/UAI/2025/Conference, auai.org/UAI/2025/Conference/Area_Chairs, auai.org/UAI/2025/Conference/Reviewers, auai.org/UAI/2025/Conference/Submission23/Authors, auai.org/UAI/2025/Conference/Submission23/Reproducibility_Reviewers
Submission Number: 23