Finding Second-order Stationary Points for Generalized-Smooth Nonconvex Minimax Optimization via Gradient-based Algorithm
Keywords: Minimax Optimization, Nonconvex Optimization, Generalized Smoothness, Second-order Stationary Point
Abstract: Nonconvex minimax problems have received intense interest in many machine learning applications such as generative adversarial networks, robust optimization, and adversarial training.
Recently, a variety of minimax optimization algorithms relying on Lipschitz smoothness
have been proposed for finding first-order or second-order stationary points.
However, the standard assumption of a Lipschitz continuous gradient or Hessian can fail to hold even in some classic minimax problems,
causing conventional minimax optimization algorithms to fail to converge in practice.
To address this challenge, we develop a new gradient-based method for nonconvex-strongly-concave minimax optimization
under a generalized smoothness assumption (a standard form of such an assumption is sketched below the abstract).
Motivated by the important application of escaping saddle points, we propose a generalized Hessian smoothness condition,
under which our gradient-based method achieves a complexity of $\mathcal{O}(\epsilon^{-1.75}\log n)$
for finding a second-order stationary point using only gradient calls.
This improves upon the state-of-the-art complexity results for nonconvex minimax optimization
even under the standard Lipschitz smoothness condition.
To the best of our knowledge, this is the first work to establish convergence
to second-order stationary points in nonconvex minimax optimization under generalized smoothness.
Experimental results on a domain adaptation application confirm the superiority of our algorithm over existing methods.
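For context (this is a sketch of a standard condition from the literature, not necessarily the paper's exact assumption): a widely used form of generalized gradient smoothness is $(L_0, L_1)$-smoothness, under which a twice-differentiable $f$ satisfies $\|\nabla^2 f(x)\| \le L_0 + L_1 \|\nabla f(x)\|$ for all $x$, with constants $L_0, L_1 \ge 0$; setting $L_1 = 0$ recovers the standard Lipschitz-gradient condition. The generalized Hessian smoothness condition proposed in the paper presumably relaxes Hessian Lipschitzness in an analogous way, though its precise form is not stated in the abstract.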
Supplementary Material: pdf
Primary Area: optimization
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 4801