DAS$^2$C: A Distributed Adaptive Minimax Method with Near-Optimal Convergence

24 Sept 2023 (modified: 27 Jan 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Minimax Optimization, Distributed Learning, Nonconvex Optimization, Convergence Analysis, Stepsize Inconsistency
TL;DR: This paper proposes a distributed adaptive method for solving nonconvex minimax problems and establishes a near-optimal convergence rate by employing adaptive stepsize control to eliminate the inconsistency in locally computed adaptive stepsizes.
Abstract: Applying adaptive methods directly to distributed minimax problems can result in non-convergence due to inconsistency in locally computed adaptive stepsizes. To address this challenge, we propose DAS$^2$C, a $\underline{\text{D}}$istributed $\underline{\text{A}}$daptive method with time-scale $\underline{\text{S}}$eparated $\underline{\text{S}}$tepsize $\underline{\text{C}}$ontrol for minimax optimization. The key strategy is an adaptive stepsize control protocol that requires transmitting only two extra (scalar) variables. This protocol enforces consistency among the nodes' stepsizes, eliminating the steady-state errors caused by the lack of stepsize coordination that commonly arise in vanilla distributed adaptive methods, and thus guarantees exact convergence. For nonconvex-strongly-concave distributed minimax problems, we characterize the specific transient times that ensure time-scale separation and quasi-independence of the network, leading to a near-optimal convergence rate of $\tilde{\mathcal{O}} \left( \epsilon ^{-\left( 4+\delta \right)} \right)$ for any small $\delta > 0$, matching that of the centralized counterpart. To the best of our knowledge, DAS$^2$C is the $\textit{first}$ distributed adaptive method guaranteeing exact convergence without requiring knowledge of any problem-dependent parameters for nonconvex minimax problems.
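The stepsize-inconsistency issue and the stepsize-control idea described above can be illustrated with a toy sketch. The snippet below is a minimal conceptual illustration, not the authors' DAS$^2$C algorithm: the decay rate, the averaging rule used to agree on a common scalar, and all variable names are assumptions made purely for illustration.

```python
# Minimal sketch (NOT the paper's DAS^2C protocol): shows why uncoordinated
# per-node adaptive stepsizes drift apart under heterogeneous data, and how
# exchanging two extra scalars (one for the min variable x, one for the max
# variable y) lets all nodes use the same adaptive stepsize.
import numpy as np

rng = np.random.default_rng(0)
n_nodes = 4
beta, eps = 0.99, 1e-8  # second-moment decay and numerical floor (assumed values)

# Per-node second-moment accumulators for x (min player) and y (max player).
v_x = np.zeros(n_nodes)
v_y = np.zeros(n_nodes)

def local_grad_norms():
    """Stand-in for per-node stochastic gradient magnitudes (heterogeneous data)."""
    return rng.uniform(0.5, 2.0, size=n_nodes), rng.uniform(0.5, 2.0, size=n_nodes)

for t in range(100):
    gx, gy = local_grad_norms()
    # Vanilla distributed adaptive update: each node scales by its OWN accumulator,
    # so the effective stepsizes 1/sqrt(v_x[i]) disagree across nodes.
    v_x = beta * v_x + (1 - beta) * gx**2
    v_y = beta * v_y + (1 - beta) * gy**2

print("uncoordinated stepsizes (x):", np.round(1.0 / np.sqrt(v_x + eps), 3))

# Stepsize-control idea: nodes additionally exchange two scalars (their x- and
# y-accumulators) and agree on a single network-wide value, e.g. by averaging,
# so every node applies the SAME adaptive stepsize.  The averaging rule here is
# an assumption; the paper's protocol may use a different consensus mechanism.
v_x_shared = np.mean(v_x)
v_y_shared = np.mean(v_y)
print("coordinated stepsize (x):", round(1.0 / np.sqrt(v_x_shared + eps), 3))
print("coordinated stepsize (y):", round(1.0 / np.sqrt(v_y_shared + eps), 3))
```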
Supplementary Material: zip
Primary Area: optimization
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9142