Accelerated Sparse Recovery via Gradient Descent with Nonlinear Conjugate Gradient Momentum

Published: 01 Jan 2023, Last Modified: 12 May 2023. J. Sci. Comput. 2023.
Abstract: This paper applies an adaptive momentum idea from the nonlinear conjugate gradient method to accelerate optimization problems arising in sparse recovery. Specifically, we consider two types of minimization problems: a (single) differentiable function, and the sum of a non-smooth function and a differentiable function. In the first case, we adopt a fixed step size to avoid the traditional line search and establish a convergence analysis of the proposed algorithm for a quadratic problem. In the second case, this acceleration is combined with an operator splitting technique to handle the non-smooth term. We use the convex $\ell_1$ and the nonconvex $\ell_1 - \ell_2$ functionals as two case studies to demonstrate the efficiency of the proposed approaches over traditional methods.
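The abstract does not spell out the update rule, so the following Python sketch illustrates the general idea only: an ISTA-type (proximal gradient) iteration for the $\ell_1$-regularized least-squares problem, with a nonlinear-conjugate-gradient-style momentum term added to the gradient step and a fixed step size $1/L$ in place of a line search. The function names (`ncg_momentum_ista`, `soft_threshold`), the choice of the Fletcher-Reeves ratio for the momentum weight, and the restart safeguard are all illustrative assumptions, not the paper's exact scheme.

```python
# Illustrative sketch (not the paper's exact algorithm): proximal gradient
# descent for the l1-regularized least-squares problem
#     min_x 0.5*||A x - b||^2 + lam*||x||_1
# accelerated with a nonlinear-conjugate-gradient-style momentum term.
# The Fletcher-Reeves momentum weight and the restart safeguard are
# assumptions chosen for illustration.
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ncg_momentum_ista(A, b, lam, n_iter=500):
    m, n = A.shape
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth gradient
    s = 1.0 / L                        # fixed step size in place of a line search
    x = np.zeros(n)
    d = np.zeros(n)                    # search direction carrying the momentum
    g_prev = None
    for _ in range(n_iter):
        g = A.T @ (A @ x - b)          # gradient of the smooth (quadratic) part
        if g_prev is None:
            d = -g                     # first iteration: plain gradient step
        else:
            beta = (g @ g) / (g_prev @ g_prev)  # Fletcher-Reeves-style weight
            d = -g + beta * d
            if g @ d >= 0.0:           # safeguard: restart if not a descent direction
                d = -g
        # forward step along the momentum direction, then the l1 proximal map;
        # with lam = 0 this reduces to the smooth (first) case of the abstract
        x = soft_threshold(x + s * d, s * lam)
        g_prev = g
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 200))
    x_true = np.zeros(200)
    x_true[:5] = rng.standard_normal(5)    # sparse ground truth
    b = A @ x_true
    x_hat = ncg_momentum_ista(A, b, lam=0.1)
    print("recovery error:", np.linalg.norm(x_hat - x_true))
```

For the nonconvex $\ell_1 - \ell_2$ case study, only the proximal step would change (the proximal map of $\ell_1 - \ell_2$ admits a closed form); the momentum update itself stays the same.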