VRAda: A Variance Reduced Adaptive Algorithm for Stochastic Parameter-Agnostic Minimax Optimizations

Published: 24 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Stochastic minimax optimization, Parameter-agnostic, Variance-reduction
Abstract: Stochastic parameter-agnostic minimax optimization offers a way to set learning rates without relying on problem-dependent parameters, bridging the gap between theoretical and empirical results in machine learning. While previous studies have successfully decoupled the timescales of the primal and dual variables and proposed unified parameter-agnostic algorithms for minimax optimization, the problem of varying inherent variances in the stochastic setting persists. This variance degradation disturbs the desired ratio of learning rates. Intuitively, variance-reduction techniques could address this issue efficiently; however, they require manual tuning of problem-dependent parameters to attain an optimal solution. In this paper, we introduce the Variance-Reduced Adaptive algorithm (VRAda), which addresses varying inherent variances and enables parameter-agnostic operation in stochastic minimax optimization. Theoretical results show that VRAda achieves an optimal sample complexity of $O(1/\epsilon^3)$ without large data batches, allowing it to find an $\epsilon$-stationary point of non-convex-strongly-concave and non-convex-Polyak-Łojasiewicz objectives. To the best of our knowledge, VRAda is the first variance-reduced adaptive algorithm designed specifically for parameter-agnostic minimax optimization. Extensive experiments across diverse applications validate the effectiveness of VRAda.
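The abstract does not spell out VRAda's update rules, but the variance-reduction family it builds on can be illustrated with a generic STORM-style recursive estimator applied to gradient descent-ascent. The sketch below is an assumption-laden toy (the problem $f(x, y) = x^2/2 - y^2/2$, the step sizes, and the momentum parameter `a` are all illustrative choices, not VRAda's actual algorithm); the key idea is that the same stochastic sample is evaluated at two consecutive iterates, so most of the gradient noise cancels in the correction term.

```python
import numpy as np

def storm_gda(T=2000, eta_x=0.05, eta_y=0.05, a=0.1, sigma=1.0, seed=0):
    """STORM-style variance-reduced gradient descent-ascent on the toy
    saddle problem f(x, y) = x**2/2 - y**2/2 (saddle point at the origin).
    Generic illustration of recursive variance reduction, NOT VRAda itself.
    """
    rng = np.random.default_rng(seed)
    gx = lambda x, y: x     # df/dx of the toy objective
    gy = lambda x, y: -y    # df/dy of the toy objective
    x, y = 3.0, -2.0
    # Initialize the estimators with a single noisy gradient sample.
    n = rng.standard_normal(2)
    dx = gx(x, y) + sigma * n[0]
    dy = gy(x, y) + sigma * n[1]
    for _ in range(T):
        # Descent step in x, ascent step in y.
        x_new, y_new = x - eta_x * dx, y + eta_y * dy
        # STORM recursion: evaluate the SAME stochastic sample (here, the
        # same additive-noise draw) at the new and old iterates, so the
        # noise largely cancels in the correction term and only an
        # O(a)-sized noise component remains.
        n = rng.standard_normal(2)
        dx = (gx(x_new, y_new) + sigma * n[0]
              + (1 - a) * (dx - (gx(x, y) + sigma * n[0])))
        dy = (gy(x_new, y_new) + sigma * n[1]
              + (1 - a) * (dy - (gy(x, y) + sigma * n[1])))
        x, y = x_new, y_new
    return x, y
```

Despite noise with standard deviation `sigma = 1.0` on every gradient query, the iterates settle near the saddle point, whereas plain stochastic gradient descent-ascent with the same step sizes would fluctuate much more widely.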
Supplementary Material: zip
Primary Area: optimization
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9118