Stochastic Weakly Convex Optimization Beyond Lipschitz Continuity

Published: 01 Jan 2024, Last Modified: 31 Jul 2025. CoRR 2024. License: CC BY-SA 4.0.
Abstract: This paper considers stochastic weakly convex optimization without the standard Lipschitz continuity assumption. Based on new adaptive regularization (stepsize) strategies, we show that a wide class of stochastic algorithms, including the stochastic subgradient method, preserves the $\mathcal{O}(1/\sqrt{K})$ convergence rate with constant failure probability. Our analyses rest on rather weak assumptions: the Lipschitz parameter can either be bounded by a general growth function of $\|x\|$ or be locally estimated through independent random samples.
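To make the setting concrete, below is a minimal sketch of a stochastic subgradient method whose stepsize is scaled by a growth function of $\|x\|$, in the spirit of the adaptive regularization strategies described in the abstract. The specific stepsize rule, the `growth` surrogate, and the phase-retrieval example are illustrative assumptions, not the paper's exact algorithm or analysis.

```python
import numpy as np

def stochastic_subgradient(subgrad_oracle, x0, K, gamma0=1.0,
                           growth=lambda r: 1.0 + r):
    """Stochastic subgradient method with a stepsize scaled by a growth
    function of ||x|| (illustrative rule; the paper's adaptive strategy
    may differ in its exact form)."""
    x = np.asarray(x0, dtype=float)
    iterates = [x.copy()]
    for _ in range(K):
        g = subgrad_oracle(x)                 # stochastic subgradient at x
        L_hat = growth(np.linalg.norm(x))     # local Lipschitz surrogate
        step = gamma0 / (np.sqrt(K) * L_hat)  # O(1/sqrt(K)) base stepsize
        x = x - step * g
        iterates.append(x.copy())
    # Return a uniformly sampled iterate, a common output rule in
    # weakly convex (Moreau-envelope) analyses.
    return iterates[np.random.randint(len(iterates))]

# Hypothetical usage: robust phase retrieval, f(x) = E|<a, x>^2 - b|,
# a weakly convex loss whose Lipschitz parameter grows with ||x||.
rng = np.random.default_rng(0)
A = rng.normal(size=(1000, 5))
x_true = rng.normal(size=5)
b = (A @ x_true) ** 2

def oracle(x):
    i = rng.integers(len(b))
    inner = A[i] @ x
    return 2.0 * np.sign(inner ** 2 - b[i]) * inner * A[i]

x_out = stochastic_subgradient(oracle, np.zeros(5), K=5000)
```

The phase-retrieval loss is a natural fit here because its subgradients are not globally bounded: their norm grows roughly linearly in $\|x\|$, which is exactly the regime where a growth-function-based stepsize replaces a global Lipschitz constant.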