Nonlinearly Preconditioned Gradient Methods under Generalized Smoothness

Published: 23 Jun 2025, Last Modified: 23 Jun 2025 · Greeks in AI 2025 Poster · CC BY 4.0
Keywords: nonconvex optimization, generalized smoothness, first-order methods
Abstract: We analyze nonlinearly preconditioned gradient methods for solving smooth minimization problems. We introduce a generalized smoothness property, based on the notion of abstract convexity, that is broader than Lipschitz smoothness, and we provide sufficient first- and second-order conditions. Notably, our framework encapsulates algorithms associated with the gradient clipping method and brings out novel insights for the class of (L0,L1)-smooth functions that has recently received widespread interest, allowing us to go beyond already established methods. We investigate the convergence of the proposed method in both the convex and nonconvex settings.
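To make the algorithmic class concrete, the following is a minimal sketch of a nonlinearly preconditioned gradient step, using gradient clipping as the preconditioner; this is the standard clipping update mentioned in the abstract, not the paper's specific method, and the names `clipped_gradient_step`, `lr`, and `gamma` are illustrative choices.

```python
import numpy as np

def clipped_gradient_step(x, grad, lr=0.5, gamma=1.0):
    """One step of x+ = x - lr * phi(grad f(x)), where phi is the
    clipping map g -> min(1, gamma/||g||) * g.  Clipping is a classic
    instance of a nonlinear preconditioner: far from a stationary
    point the step length is capped at lr * gamma, while near one the
    update reduces to plain gradient descent."""
    g = grad(x)
    norm = np.linalg.norm(g)
    scale = min(1.0, gamma / norm) if norm > 0 else 0.0
    return x - lr * scale * g

# Illustrative run on f(x) = ||x||^2 / 2, whose gradient is x.
x = np.array([10.0, -10.0])
for _ in range(200):
    x = clipped_gradient_step(x, lambda v: v)
```

Because the step length is bounded by `lr * gamma` regardless of the gradient magnitude, such methods remain stable on functions whose curvature grows with the gradient norm, which is the regime the (L0,L1)-smoothness condition is designed to capture.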
Submission Number: 139