Proximal-like incremental aggregated gradient method with Bregman distance in weakly convex optimization problems

Published: 01 Jan 2021, Last Modified: 13 May 2023. J. Glob. Optim. 2021.
Abstract: We focus on a special nonconvex and nonsmooth composite function, namely the sum of smooth weakly convex component functions and a proper lower semicontinuous weakly convex function. The proximal-like incremental aggregated gradient (PLIAG) method proposed in Zhang et al. (Math Oper Res 46(1):61–81, 2021) has been shown to be convergent and highly efficient for solving convex minimization problems. The algorithm not only avoids evaluating the exact full gradient, which can be expensive in big-data models, but also weakens the stringent global Lipschitz gradient continuity assumption on the smooth part of the problem. In the nonconvex case, however, there is little analysis of the convergence of the PLIAG method. In this paper, we prove that every limit point of the sequence generated by the PLIAG method is a critical point of the weakly convex problem. Under the further assumption that the objective function satisfies the Kurdyka–Łojasiewicz (KL) property, we prove that the generated sequence converges globally to a critical point of the problem. Additionally, we give the convergence rate when the Łojasiewicz exponent is known.
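To fix ideas, the following is a minimal LaTeX sketch of the problem class the abstract describes and of a generic Bregman-proximal incremental aggregated update of the kind the PLIAG method uses; the symbols f_i, g, h, D_h, the step sizes \lambda_k, and the delays \tau_{i,k} are notation introduced here for illustration and are not taken from the paper.

% Composite problem: m smooth weakly convex components plus a proper
% lower semicontinuous weakly convex regularizer g (notation assumed here).
\[
  \min_{x \in \mathbb{R}^n} \; F(x) \;=\; \sum_{i=1}^{m} f_i(x) \;+\; g(x).
\]
% Schematic PLIAG-type step: the full gradient is replaced by delayed component
% gradients (bounded delays \tau_{i,k}), and a Bregman distance D_h generated by a
% kernel h replaces the squared Euclidean distance in the proximal term.
\begin{align*}
  D_h(x,y) &= h(x) - h(y) - \langle \nabla h(y),\, x - y \rangle, \\
  x^{k+1} &\in \operatorname*{arg\,min}_{x} \Big\{ g(x)
      + \Big\langle \textstyle\sum_{i=1}^{m} \nabla f_i\big(x^{\,k-\tau_{i,k}}\big),\, x \Big\rangle
      + \tfrac{1}{\lambda_k}\, D_h(x, x^k) \Big\}.
\end{align*}

This schematic only records the general structure (aggregated delayed gradients plus a Bregman proximal term); the precise step-size rules and assumptions are those of the paper and of Zhang et al. (2021).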