TL;DR: We develop two novel stochastic variance-reduced optimistic gradient-type methods to solve a class of nonmonotone generalized equations.
Abstract: We develop two novel stochastic variance-reduction methods to approximate solutions of a class of nonmonotone generalized equations. Our algorithms leverage a new combination of ideas from the forward-reflected-backward splitting method and a class of unbiased variance-reduced estimators. We construct two new stochastic estimators within this class, inspired by the well-known SVRG and SAGA estimators. These estimators differ significantly from existing approaches used in minimax and variational inequality problems. By appropriately choosing parameters, both algorithms achieve a state-of-the-art oracle complexity of $\mathcal{O}(n + n^{2/3} \epsilon^{-2})$ for obtaining an $\epsilon$-solution in terms of the operator residual norm for a class of nonmonotone problems, where $n$ is the number of summands and $\epsilon$ is the desired accuracy. This complexity matches the best-known results for SVRG and SAGA methods in stochastic nonconvex optimization. We test our algorithms on several numerical examples and compare them with existing methods; the results demonstrate promising improvements of the new methods over their competitors.
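To make the ingredients concrete, here is a minimal sketch of how an SVRG-type estimator can be combined with a forward-reflected step to solve a finite-sum equation $Gx = 0$ with $Gx = \frac{1}{n}\sum_{i=1}^n G_i x$. This is illustrative only, not the paper's exact scheme: we assume the unconstrained case $T \equiv 0$ (so the backward/resolvent step disappears), and all names and parameter values (svrg_forward_reflected, G_i, eta, epoch_len) are hypothetical.

    import numpy as np

    def full_op(G_i, n, x):
        """Full operator value G(x) = (1/n) * sum_{i=1}^{n} G_i(i, x)."""
        return sum(G_i(i, x) for i in range(n)) / n

    def svrg_forward_reflected(G_i, n, x0, eta=0.05, batch=8,
                               epoch_len=50, iters=500, seed=0):
        """Illustrative SVRG-style forward-reflected iteration for G(x) = 0
        (the resolvent step of T is omitted here, i.e., T == 0)."""
        rng = np.random.default_rng(seed)
        x_prev = x0.copy()
        x = x0.copy()
        snap = x0.copy()                  # SVRG snapshot point
        g_snap = full_op(G_i, n, snap)    # full operator at the snapshot
        for k in range(iters):
            idx = rng.integers(0, n, size=batch)
            # Unbiased SVRG estimate of the forward-reflected value
            # 2*G(x) - G(x_prev): the correction below has expectation
            # 2*G(x) - G(x_prev) - G(snap), so adding g_snap removes the bias.
            corr = sum(2 * G_i(i, x) - G_i(i, x_prev) - G_i(i, snap)
                       for i in idx) / batch
            v = g_snap + corr
            x_prev, x = x, x - eta * v    # forward (explicit) step
            if (k + 1) % epoch_len == 0:  # periodic snapshot refresh
                snap = x.copy()
                g_snap = full_op(G_i, n, snap)
        return x

As a toy instance, G_i = lambda i, x: A[i] @ x - b[i] with nonsymmetric matrices A[i] yields a simple finite-sum operator, possibly nonmonotone, on which such a scheme can be exercised.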
Lay Summary: We develop two new variance-reduced algorithms based on the forward-reflected-backward splitting method to tackle a class of nonmonotone root-finding problems. These methods encompass both SVRG and SAGA estimators as special cases. By carefully selecting the parameters, our algorithms attain an $\epsilon$-solution with state-of-the-art oracle complexity, matching the best-known bounds for nonconvex optimization methods based on SVRG and SAGA. While the first scheme resembles a stochastic variant of the optimistic gradient method, the second one is entirely novel and distinct from existing approaches, even from their deterministic counterparts. We validate our methods through numerical examples, and the results demonstrate promising performance compared to existing techniques under careful parameter selection.
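For context on the optimistic gradient connection, the deterministic forward-reflected-backward splitting update (Malitsky and Tam, 2020) that the stochastic schemes randomize reads, in our illustrative notation with resolvent $J_{\eta T} = (\mathbb{I} + \eta T)^{-1}$ and step size $\eta > 0$:
$$x_{k+1} = J_{\eta T}\big(x_k - \eta\,(2Gx_k - Gx_{k-1})\big),$$
which, when $T \equiv 0$, reduces to the optimistic gradient step $x_{k+1} = x_k - \eta\,(2Gx_k - Gx_{k-1})$.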
Primary Area: Optimization->Stochastic
Keywords: variance-reduction; forward-reflected-backward splitting method; nonmonotone generalized equations; SVRG; SAGA
Submission Number: 7959