On the Complexity of Finding Small Subgradients in Nonsmooth Optimization

Published: 23 Nov 2022, Last Modified: 05 May 2023 · OPT 2022 Oral
Keywords: nonsmooth optimization, nonconvex optimization, oracle complexity, de-randomization
TL;DR: We study the oracle complexity of producing generalized stationary points in nonsmooth optimization across several settings, with or without randomization and convexity.
Abstract: We study the oracle complexity of producing $(\delta,\epsilon)$-stationary points of Lipschitz functions, in the sense proposed by Zhang et al. [2020]. While there exist dimension-free randomized algorithms for producing such points within $\widetilde{O}(1/\delta\epsilon^3)$ first-order oracle calls, we show that no dimension-free rate can be achieved by a deterministic algorithm. On the other hand, we point out that this rate can be derandomized for smooth functions with merely a logarithmic dependence on the smoothness parameter. Moreover, we establish several lower bounds for this task which hold for any randomized algorithm, with or without convexity. Finally, we show how the convergence rate of finding $(\delta,\epsilon)$-stationary points can be improved when the function is convex, a setting which we motivate by proving that in general no finite-time algorithm can produce points with small subgradients even for convex functions.
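For context, Zhang et al. [2020] call a point $x$ $(\delta,\epsilon)$-stationary for $f$ when the Goldstein $\delta$-subdifferential, i.e. the convex hull of subgradients taken at all points within distance $\delta$ of $x$, contains a vector of norm at most $\epsilon$:

$$\min_{g \in \partial_\delta f(x)} \|g\| \le \epsilon, \qquad \partial_\delta f(x) := \mathrm{conv}\Bigl(\,\bigcup_{y:\|y-x\|\le\delta} \partial f(y)\Bigr).$$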