Keywords: Non-convex Optimization, Nash Equilibrium, Gradient Dominance, Strict Saddle
Abstract: We consider the optimization problem of finding a Nash Equilibrium (NE) of a nonconvex function $f(x)=f(x_1,\dots,x_n)$, where $x_i\in\mathbb{R}^{d_i}$ denotes the $i$-th block of variables.
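For context, a minimal sketch of the block-wise NE notion typically used in this single-objective setting (assuming the standard definition; the paper's exact formulation may differ): a point $x^*=(x_1^*,\dots,x_n^*)$ is an NE if no block can decrease $f$ unilaterally,
$$ f(x_1^*,\dots,x_{i-1}^*,\,x_i,\,x_{i+1}^*,\dots,x_n^*) \;\ge\; f(x^*) \quad \text{for all } x_i\in\mathbb{R}^{d_i},\ i=1,\dots,n. $$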
We focus on first-order gradient-based algorithms and their variants, such as the block coordinate descent (BCD) algorithm, for solving this problem.
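As a rough illustration of the BCD template referenced above (a generic cyclic block-gradient sketch, not the paper's algorithm; the step size `lr` and the example objective are hypothetical):

```python
import numpy as np

def block_coordinate_descent(grad_blocks, x_blocks, lr=0.1, n_iters=200):
    """Cyclic block coordinate gradient descent (generic sketch).

    grad_blocks[i](x) returns the partial gradient of f with respect to
    block i, evaluated at the current list of blocks x.
    """
    x = [np.asarray(b, dtype=float) for b in x_blocks]
    for _ in range(n_iters):
        for i, grad_i in enumerate(grad_blocks):
            x[i] = x[i] - lr * grad_i(x)  # gradient step on block i, other blocks held fixed
    return x

# Illustrative objective: f(x1, x2) = ||x1||^2 + ||x2||^2 + x1 . x2
g1 = lambda x: 2 * x[0] + x[1]   # partial gradient w.r.t. x1
g2 = lambda x: 2 * x[1] + x[0]   # partial gradient w.r.t. x2
blocks = block_coordinate_descent([g1, g2], [np.ones(3), -np.ones(3)])
```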
We introduce a set of conditions, collectively termed the $n$-sided PL condition, which extends both the well-established gradient dominance condition, also known as the Polyak-{\L}ojasiewicz (PL) condition, and the concept of multi-convexity. This condition is satisfied by several classes of non-convex functions and allows us to analyze the convergence of a range of gradient descent (GD) algorithms.
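For reference, the classical PL (gradient dominance) condition for a function $f$ with minimum value $f^*$ reads $\tfrac{1}{2}\|\nabla f(x)\|^2 \ge \mu\,(f(x)-f^*)$ for some $\mu>0$. A natural block-wise ("$n$-sided") analogue, written here only as a plausible sketch since the precise definition is not stated in this abstract, would require for each block $i$
$$ \frac{1}{2}\,\big\|\nabla_{x_i} f(x)\big\|^2 \;\ge\; \mu_i\,\Big(f(x) - \min_{y_i\in\mathbb{R}^{d_i}} f(y_i, x_{-i})\Big), \qquad \mu_i>0, $$
where $x_{-i}$ collects all blocks except the $i$-th.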
Moreover, we study scenarios in which the objective function has only strict saddle points and standard gradient descent methods fail to converge to an NE. In such cases, we propose adapted variants of GD that converge to an NE and analyze their convergence rates.
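One common way to adapt GD so that it escapes strict saddle points is to inject a small random perturbation whenever the gradient becomes small; the sketch below illustrates that generic mechanism and is not necessarily the specific variant proposed in the paper (step size, tolerances, and the example objective are hypothetical):

```python
import numpy as np

def perturbed_gd(grad, x0, lr=0.01, noise_radius=1e-2, grad_tol=1e-3,
                 n_iters=5000, seed=0):
    """Gradient descent with random perturbations at near-stationary points.

    When the gradient norm falls below grad_tol (a candidate saddle point),
    a small random step is taken to help leave strict saddles; otherwise a
    plain gradient step is used. Generic sketch only.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        g = grad(x)
        if np.linalg.norm(g) < grad_tol:
            noise = rng.normal(size=x.shape)
            x = x + noise_radius * noise / np.linalg.norm(noise)  # random step on a small sphere
        else:
            x = x - lr * g  # plain gradient step
    return x

# Illustrative objective: f(x, y) = (x^2 - 1)^2 + y^2 has a strict saddle at the
# origin and minimizers at (+/-1, 0); plain GD started at the origin stays stuck.
grad = lambda z: np.array([4 * z[0] * (z[0] ** 2 - 1), 2 * z[1]])
x_final = perturbed_gd(grad, x0=[0.0, 0.0])
```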
Supplementary Material: zip
Primary Area: optimization
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 4720