A gradient sampling method with complexity guarantees for Lipschitz functions in high and low dimensions

Published: 31 Oct 2022 · Last Modified: 19 Jan 2023 · NeurIPS 2022 Accept · Readers: Everyone
Keywords: nonconvex optimization, nonsmooth optimization, nonconvex nonsmooth optimization, Goldstein subdifferential, cutting plane method
Abstract: Zhang et al. (ICML 2020) introduced a novel modification of Goldstein's classical subgradient method, with an efficiency guarantee of $O(\varepsilon^{-4})$ for minimizing Lipschitz functions. Their work, however, makes use of an oracle that is not efficiently implementable. In this paper, we obtain the same efficiency guarantee with a standard subgradient oracle, thus making our algorithm efficiently implementable. Our resulting method works on any Lipschitz function whose value and gradient can be evaluated at points of differentiability. We additionally present a new cutting plane algorithm that achieves an efficiency of $O(d\varepsilon^{-2}\log S)$ for the class of $S$-smooth (and possibly non-convex) functions in low dimensions. Strikingly, this $\varepsilon$-dependence matches the lower bounds for the convex setting.
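The abstract only sketches the algorithmic idea, so here is a minimal, hypothetical Python sketch of a Goldstein-style gradient sampling step, not the authors' exact method: estimate a small-norm element of the Goldstein subdifferential from gradients sampled in a delta-ball (approximated here by a Frank-Wolfe solve over the convex hull of the samples), then step against it. All names and parameters (goldstein_style_step, num_samples, the step rule) are illustrative assumptions.

```python
import numpy as np

def min_norm_element(G, iters=100):
    """Approximate the minimal-norm point in the convex hull of the rows
    of G via Frank-Wolfe on ||G^T w||^2 over the probability simplex."""
    k = G.shape[0]
    w = np.full(k, 1.0 / k)
    for t in range(iters):
        g = G.T @ w                 # current convex combination of gradients
        scores = G @ g              # <g_i, g> for each sampled gradient
        i = int(np.argmin(scores))  # simplex vertex minimizing the linearization
        gamma = 2.0 / (t + 2.0)     # standard Frank-Wolfe step size
        w = (1.0 - gamma) * w
        w[i] += gamma
    return G.T @ w

def goldstein_style_step(f_grad, x, delta, num_samples=10, rng=None):
    """One descent step: sample gradients in a delta-ball around x and move
    against an approximate minimal-norm element of their convex hull."""
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    # Sample points uniformly from the delta-ball; a Lipschitz function is
    # differentiable almost everywhere, so random points are differentiable
    # with probability one.
    U = rng.normal(size=(num_samples, d))
    U /= np.linalg.norm(U, axis=1, keepdims=True)
    radii = delta * rng.uniform(size=(num_samples, 1)) ** (1.0 / d)
    G = np.vstack([f_grad(p) for p in x + radii * U])
    g = min_norm_element(G)
    norm = np.linalg.norm(g)
    if norm < 1e-12:
        return x, g  # approximately (delta, epsilon)-Goldstein stationary
    return x - delta * g / norm, g  # normalized step, as in Goldstein's method

# Example on a nonsmooth Lipschitz function, f(x) = ||x||_1.
f_grad = lambda x: np.sign(x)
x = np.array([3.0, -2.0])
for _ in range(50):
    x, g = goldstein_style_step(f_grad, x, delta=0.1)
```

The normalized step of length delta is the trademark of Goldstein's scheme: any small-norm point of the sampled convex hull certifies approximate stationarity, and otherwise its negation is a descent direction for the delta-ball.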
TL;DR: We give improved complexity results for nonconvex nonsmooth optimization.
Supplementary Material: pdf