Choose Your Path Wisely: Gradient Descent in a Bregman Distance Framework

Published: 01 Jan 2021, Last Modified: 15 May 2023 · SIAM J. Imaging Sci. 2021
Abstract: We propose an extension of a special form of gradient descent, known in the literature as the linearized Bregman iteration, to a larger class of nonconvex functions. We replace the classical (squared) two-norm metric in the gradient descent setting with a generalized Bregman distance, based on a proper, convex, and lower semicontinuous function. The algorithm's global convergence is proven for functions that satisfy the Kurdyka--Łojasiewicz property. Examples illustrate that features of different scales are introduced throughout the iterations, transitioning from coarse to fine. This coarse-to-fine approach with respect to scale allows us to recover solutions of nonconvex optimization problems that are superior to those obtained with conventional gradient descent, or even with projected and proximal gradient descent. The effectiveness of the linearized Bregman iteration in combination with early stopping is illustrated for the applications of parallel magnetic resonance imaging, blind deconvolution, and image classification with neural networks.
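For concreteness, below is a minimal sketch of the classical linearized Bregman iteration that the paper extends, assuming the common elastic-net-type Bregman function J(u) = ||u||_1 + (1/(2*gamma))||u||^2, for which the gradient of the convex conjugate J* is a scaled soft-shrinkage. The names grad_E, tau, gamma, and the sparse-recovery usage example are illustrative assumptions, not the paper's exact setup for nonconvex problems.

```python
import numpy as np

def soft_shrink(p, lam=1.0):
    """Componentwise soft-shrinkage: sign(p) * max(|p| - lam, 0)."""
    return np.sign(p) * np.maximum(np.abs(p) - lam, 0.0)

def linearized_bregman(grad_E, u0, tau=0.1, gamma=1.0, n_iter=200):
    """Sketch of linearized Bregman iteration (an assumption: classical form,
    with J(u) = ||u||_1 + (1/(2*gamma))||u||^2, so grad J*(p) = gamma*shrink(p, 1)).

    The dual variable p (a subgradient of J) takes the gradient step; the
    primal iterate u is recovered via u = grad J*(p). Early stopping acts as
    regularization: coarse-scale features enter the iterate first, finer
    scales appear later.
    """
    p = u0 / gamma + np.sign(u0)  # a subgradient of J at u0 (p = 0 if u0 = 0)
    u = u0.copy()
    for _ in range(n_iter):
        p = p - tau * grad_E(u)          # subgradient (dual) update
        u = gamma * soft_shrink(p, 1.0)  # primal update via grad of J*
    return u

# Hypothetical usage: sparse recovery from an underdetermined linear system,
# with the smooth objective E(u) = 0.5 * ||A u - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[[3, 17, 40]] = [2.0, -1.5, 3.0]
b = A @ x_true
grad_E = lambda u: A.T @ (A @ u - b)
u_hat = linearized_bregman(grad_E, np.zeros(50), tau=1.0 / np.linalg.norm(A, 2) ** 2)
```

In this convex illustration the iterates stay sparse by construction, so stopping early yields a sparse approximate solution; the paper's contribution is extending this scheme, with global convergence guarantees under the Kurdyka--Łojasiewicz property, to nonconvex objectives E.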