Abstract: Sharpness-Aware Minimization (SAM) is an iterative optimization method for training neural networks that steers training toward flat minima, so that the solution found at convergence may generalize well. However, previous convergence analyses of SAM only guarantee that such a solution is attained at some unspecified iterate. We prove that SAM converges at its last iterate almost surely.
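For context, a minimal sketch of the standard two-step SAM update is given below: each iteration first perturbs the weights in the direction of steepest loss ascent within a radius rho, then applies a gradient step evaluated at the perturbed point. The toy quadratic loss and the values of `rho` and `lr` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def loss(w):
    # Toy quadratic loss L(w) = ||w||^2 / 2 (illustrative only).
    return 0.5 * np.dot(w, w)

def grad(w):
    # Gradient of the toy loss.
    return w

def sam_step(w, rho=0.05, lr=0.1):
    # Ascent step: perturb the weights toward higher loss within radius rho.
    g = grad(w)
    eps = rho * g / (np.linalg.norm(g) + 1e-12)
    # Descent step: apply the gradient evaluated at the perturbed point.
    return w - lr * grad(w + eps)

w = np.array([1.0, -2.0])
for t in range(100):
    w = sam_step(w)
print(loss(w))  # the loss decreases toward the minimum at w = 0
```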