Keywords: Zeroth-Order Optimization, Sharpness-Aware Minimization
Abstract: Classic zeroth-order optimization approaches typically optimize a smoothed version of the original function, i.e., the expected objective under randomly perturbed model parameters. This can be interpreted as encouraging the loss values within the perturbation set to be small on average. Popular sharpness-aware minimization (SAM) objectives, by contrast, focus on the largest loss within the neighborhood to arrive at flat minima more effectively. In this work, we explicitly connect zeroth-order optimization (and its corresponding objectives) with SAM approaches, grounded in an exponential tilting objective that provides a natural transition between the $\texttt{average}$ and the $\texttt{max}$. We explore new zeroth-order algorithms to solve a $\textit{soft}$ SAM objective parameterized by a tilting parameter $t$ that covers the average and min-max formulations as special cases, and we provide precise characterizations of the sharpness notions of the tilted SAM framework. Practically, our approach can be used as a gradient-free and memory-efficient alternative to SAM variants, and it achieves better generalization than vanilla zeroth-order baselines on a wide range of downstream tasks, including classification, multiple choice, and language generation.
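The exponential tilting objective described above can be sketched numerically. The snippet below (an illustrative sketch, not the authors' implementation; the function names `tilted_objective` and `zo_tilted_grad` are hypothetical) shows the standard tilted value $\frac{1}{t}\log \mathbb{E}[\exp(t\,L)]$ interpolating between the average loss (as $t \to 0$) and the max loss (as $t \to \infty$), together with a generic two-point zeroth-order gradient estimate of that objective.

```python
import numpy as np

def tilted_objective(losses, t):
    """Exponential tilt (1/t) * log(mean(exp(t * losses))).

    Computed in a numerically stable way by factoring out the max.
    Recovers the average as t -> 0 and the max as t -> infinity.
    """
    z = t * np.asarray(losses, dtype=float)
    m = z.max()
    return (m + np.log(np.mean(np.exp(z - m)))) / t

def zo_tilted_grad(loss_fn, x, t, n_perturb=16, mu=1e-3, rng=None):
    """Generic two-point zeroth-order gradient estimate of the tilted
    objective, where the tilt is taken over Gaussian perturbations of x.
    This is a standard finite-difference estimator, used here only to
    illustrate the gradient-free setting; it is not the paper's algorithm.
    """
    rng = np.random.default_rng(rng)
    d = x.shape[0]

    def smoothed_tilted(y):
        # Tilted average of losses at randomly perturbed parameters.
        perturbs = rng.standard_normal((n_perturb, d)) * mu
        losses = np.array([loss_fn(y + p) for p in perturbs])
        return tilted_objective(losses, t)

    # Two-point estimate along a single random direction u.
    u = rng.standard_normal(d)
    g = (smoothed_tilted(x + mu * u) - smoothed_tilted(x - mu * u)) / (2 * mu)
    return g * u

# The tilt interpolates between mean and max of the neighborhood losses:
losses = np.array([1.0, 2.0, 3.0])
near_avg = tilted_objective(losses, 1e-6)   # close to mean = 2.0
near_max = tilted_objective(losses, 100.0)  # close to max = 3.0
```

As a design note, the max factored out inside `tilted_objective` is the usual log-sum-exp stabilization, which keeps the computation finite even for large $t$ where `exp(t * losses)` would overflow.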
Primary Area: optimization
Submission Number: 11124