Convergence of a stochastic subgradient method with averaging for nonsmooth nonconvex constrained optimization

Abstract: We prove convergence of a single time-scale stochastic subgradient method with subgradient averaging for constrained problems whose objective function is nonsmooth, nonconvex, and generalized differentiable. As a tool in our analysis, we also prove a chain rule on a path for such functions.
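The abstract describes a single time-scale scheme: one stepsize sequence drives both a running average of stochastic subgradients and a projected step along that average. The following is a minimal, hypothetical sketch of such a scheme (not the paper's exact algorithm); the oracle, the box constraint, the stepsize exponent, and the averaging weight `a` are all illustrative assumptions, shown here on the simple nonsmooth objective f(x) = |x| over an interval.

```python
import numpy as np

def project_box(x, lo, hi):
    # Euclidean projection onto the box [lo, hi]
    return np.clip(x, lo, hi)

def averaged_subgradient_method(sub_oracle, x0, lo, hi,
                                steps=5000, a=1.0, seed=0):
    """Illustrative single time-scale stochastic subgradient method
    with subgradient averaging (a sketch, not the paper's algorithm)."""
    rng = np.random.default_rng(seed)
    x = float(x0)
    z = 0.0  # running average of stochastic subgradients
    for k in range(1, steps + 1):
        tau = k ** -0.75            # one stepsize drives both updates
        g = sub_oracle(x, rng)      # noisy subgradient estimate
        z = (1.0 - a * tau) * z + a * tau * g   # subgradient averaging
        x = project_box(x - tau * z, lo, hi)    # projected step along z
    return x

# Toy nonsmooth objective f(x) = |x|; the oracle returns a
# noise-corrupted subgradient sign(x) + xi (an assumed setup).
def abs_oracle(x, rng):
    return np.sign(x) + 0.5 * rng.standard_normal()

x_final = averaged_subgradient_method(abs_oracle, x0=1.5, lo=-2.0, hi=2.0)
print(x_final)  # expected to settle near the minimizer x = 0
```

The averaging step is what allows a single stepsize sequence: the direction `z` filters the oracle noise, so the iterate `x` can follow it at the same time scale without a separate, slower averaging schedule.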