Fundamental Convergence Analysis of Sharpness-Aware Minimization

Published: 26 Sept 2024, Last Modified: 30 Sept 2024, NeurIPS 2024, CC0 1.0
Abstract: The paper investigates the fundamental convergence properties of Sharpness-Aware Minimization (SAM), a recently proposed gradient-based optimization method \citep{foret21} that significantly improves the generalization of deep neural networks. The convergence properties established for the method include the stationarity of accumulation points, the convergence of the sequence of gradients to the origin, of the sequence of function values to the optimal value, and of the sequence of iterates to an optimal solution. The universality of the provided convergence analysis, based on inexact gradient descent frameworks \citep{kmt23.1}, allows its extension to efficient normalized variants of SAM such as F-SAM \citep{li2024friendly}, VaSSO \citep{li23vasso}, and RSAM \citep{liu22}, as well as to unnormalized variants such as USAM \citep{maksym22}. Numerical experiments on classification tasks with deep learning models confirm the practical aspects of the analysis.
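For context, the SAM update the abstract refers to perturbs the weights toward the worst-case direction within a small neighborhood before taking the descent step. Below is a minimal, hedged sketch of one normalized SAM step on a toy quadratic loss; the hyperparameter names `rho` and `lr` and the loss are illustrative choices, not taken from the paper.

```python
import numpy as np

def sam_step(w, grad_fn, rho=0.05, lr=0.1):
    """One SAM iteration: ascent to the worst-case neighbor, then descent."""
    g = grad_fn(w)
    # Normalized ascent step to the (approximate) worst-case point
    # within an L2 ball of radius rho around w.
    eps = rho * g / (np.linalg.norm(g) + 1e-12)
    # Descent step using the gradient evaluated at the perturbed point.
    return w - lr * grad_fn(w + eps)

# Toy example: f(w) = 0.5 * ||w||^2, so grad f(w) = w.
f_grad = lambda w: w

w = np.array([1.0, -2.0])
for _ in range(100):
    w = sam_step(w, f_grad)
```

On this example the iterates approach the minimizer at the origin but, because of the gradient normalization and constant step size, stabilize at a small nonzero norm on the order of `rho * lr`, which is consistent with the kind of convergence subtleties the analysis addresses.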