A class of accelerated GADMM-based method for multi-block nonconvex optimization problems

Published: 01 Jan 2025, Last Modified: 13 May 2025. Numer. Algorithms 2025. License: CC BY-SA 4.0
Abstract: To improve computational efficiency, we consider a class of accelerated methods, based on the generalized alternating direction method of multipliers (GADMM), for solving multi-block nonconvex and nonsmooth optimization problems. First, we linearize the smooth part of the objective function and add proximal terms to the subproblems, yielding the proximal linearized GADMM. Then, we introduce an inertial technique and obtain the inertial proximal linearized GADMM. The convergence of the regularized augmented Lagrangian function sequence is proved under appropriate assumptions. When some component functions of the objective are convex, we use an error bound condition to show that the sequences generated by the algorithms converge locally to a critical point at an R-linear rate. Moreover, we apply the proposed algorithms to SCAD and robust PCA problems to verify their efficiency.
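The abstract's algorithmic ingredients (linearizing the smooth part, adding a proximal term, and inertial extrapolation of the primal iterates) can be illustrated on a toy two-block problem. The sketch below is NOT the paper's multi-block GADMM: the relaxation parameter of GADMM is fixed to 1, the problem is a convex LASSO instance (min 0.5||Ax-b||² + λ||z||₁ s.t. x - z = 0), and the inertial weight `alpha`, penalty `rho`, and step parameter `eta` are illustrative choices, not values from the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def inertial_prox_lin_admm(A, b, lam=0.1, rho=1.0, alpha=0.3, iters=2000):
    """Hedged sketch of an inertial proximal linearized ADMM iteration for
        min_x 0.5*||A x - b||^2 + lam*||z||_1   s.t.  x - z = 0.
    `alpha` is a hypothetical inertial weight; GADMM's relaxation
    parameter is fixed to 1 here for simplicity."""
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)  # u: scaled multiplier
    x_prev = x.copy(); z_prev = z.copy()
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the smooth gradient
    eta = 1.1 * (L + rho)               # proximal step parameter, eta > L + rho
    for _ in range(iters):
        # inertial extrapolation of both primal blocks
        x_hat = x + alpha * (x - x_prev)
        z_hat = z + alpha * (z - z_prev)
        x_prev, z_prev = x.copy(), z.copy()
        # x-step: linearize the smooth part of the augmented Lagrangian at
        # x_hat and take a proximal (gradient) step of size 1/eta
        grad = A.T @ (A @ x_hat - b) + rho * (x_hat - z_hat + u)
        x = x_hat - grad / eta
        # z-step: exact proximal map of the nonsmooth l1 term
        z = soft_threshold(x + u, lam / rho)
        # dual (multiplier) ascent for the constraint x - z = 0
        u = u + (x - z)
    return x, z
```

With `alpha = 0` and `rho` fixed, this reduces to a plain proximal linearized ADMM; the inertial term reuses the previous iterate to accelerate convergence, which is the effect the paper quantifies via the R-linear rate under an error bound condition.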