Keywords: Nonsmooth Nonconvex Optimization, Decentralized Optimization, Online Learning
TL;DR: We propose a decentralized algorithm for nonsmooth nonconvex optimization, achieving better sample complexity, communication rounds, and computation rounds than previous algorithms in both the first-order and zeroth-order settings.
Abstract: This paper considers the decentralized nonsmooth nonconvex optimization problem with Lipschitz continuous local functions.
We propose an efficient stochastic first-order method with client sampling that achieves a $(\delta,\epsilon)$-Goldstein stationary point with an overall sample complexity of ${\mathcal O}(\delta^{-1}\epsilon^{-3})$, ${\mathcal O}(\delta^{-1}\epsilon^{-3})$ computation rounds,
and ${\tilde{\mathcal O}}(\gamma^{-1/2}\delta^{-1}\epsilon^{-3})$ communication rounds, where $\gamma$ is the spectral gap of the mixing matrix of the network.
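For context, the standard definitions behind these quantities (not restated in the abstract itself): a point $x$ is a $(\delta,\epsilon)$-Goldstein stationary point of $f$ if the Goldstein $\delta$-subdifferential contains an element of norm at most $\epsilon$,
$$\min_{g \in \partial_\delta f(x)} \|g\| \le \epsilon, \qquad \partial_\delta f(x) := \mathrm{conv}\Big( \bigcup_{\|y - x\| \le \delta} \partial f(y) \Big),$$
and the spectral gap of a (typically doubly stochastic) mixing matrix $W$ is $\gamma := 1 - \lambda_2(W)$, with $\lambda_2(W)$ the second-largest eigenvalue of $W$ in magnitude.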
Our results achieve the optimal sample complexity and a sharper communication complexity than existing methods.
We also extend our ideas to zeroth-order optimization.
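As a sketch of what such a zeroth-order extension typically relies on (the standard two-point estimator; the paper's exact oracle may differ): with a query direction $u$ drawn uniformly from the unit sphere $\mathbb{S}^{d-1}$, one forms
$$\hat g(x) = \frac{d}{2\delta}\big( f(x + \delta u) - f(x - \delta u) \big)\, u,$$
an unbiased estimate of $\nabla f_\delta(x)$ for the uniform smoothing $f_\delta(x) = \mathbb{E}_{v \sim \mathbb{B}}[f(x + \delta v)]$, whose gradient lies in the Goldstein $\delta$-subdifferential of the original $f$.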
Moreover, numerical experiments demonstrate the empirical advantage of our methods.
Supplementary Material: zip
Primary Area: optimization
Submission Number: 14793