Convergence of a stochastic subgradient method with averaging for nonsmooth nonconvex constrained optimization

Published: 01 Jan 2020. Last Modified: 05 May 2023. Optim. Lett. 2020.
Abstract: We prove convergence of a single time-scale stochastic subgradient method with subgradient averaging for constrained problems with a nonsmooth and nonconvex objective function having the property of generalized differentiability. As a tool of our analysis, we also prove a chain rule on a path for such functions.
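As an illustration of the class of methods the abstract describes, the sketch below shows a generic single time-scale projected stochastic subgradient iteration with subgradient averaging. It is a minimal sketch, not the paper's exact algorithm: the objective (here simply f(x) = |x|, a nonsmooth convex stand-in), the box constraint, the noise model, and all parameter names (`gamma`, `beta`) are illustrative assumptions.

```python
import numpy as np

def proj_box(x, lo=-1.0, hi=1.0):
    # Projection onto the feasible set; a box is assumed for illustration.
    return np.clip(x, lo, hi)

def noisy_subgrad(x, rng):
    # Stochastic subgradient oracle for the stand-in objective f(x) = |x|:
    # a subgradient sign(x) corrupted by zero-mean noise.
    return np.sign(x) + 0.1 * rng.standard_normal(x.shape)

def averaged_subgradient_method(x0, steps=2000, gamma=0.05, beta=0.5, seed=0):
    # Single time-scale scheme: the averaged direction z and the iterate x
    # are updated in the same loop with fixed step parameters.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    z = np.zeros_like(x)
    for _ in range(steps):
        g = noisy_subgrad(x, rng)
        z = (1 - beta) * z + beta * g   # subgradient averaging
        x = proj_box(x - gamma * z)     # projected step along the average
    return x
```

Averaging the noisy subgradients before stepping damps the oracle noise, which is the mechanism that lets such schemes run on a single time scale rather than with separate fast/slow stepsizes.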