On the Convergence of the Stochastic Primal-Dual Hybrid Gradient for Convex Optimization

Published: 01 Jan 2020, Last Modified: 01 Oct 2024. Venue: CoRR 2020. License: CC BY-SA 4.0.
Abstract: The Stochastic Primal-Dual Hybrid Gradient (SPDHG), proposed by Chambolle et al. (2018), is an efficient algorithm for solving a class of nonsmooth large-scale optimization problems. In this paper we prove its almost sure convergence for convex but not necessarily strongly convex functionals. We also study its application to parallel Magnetic Resonance Imaging (MRI) reconstruction in order to test the performance of SPDHG. Our numerical results show that for a range of settings SPDHG converges significantly faster than its deterministic counterpart.
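To make the algorithm concrete, the following is a minimal NumPy sketch of the SPDHG iteration of Chambolle et al. (2018) applied to a toy problem of the form min_x g(x) + sum_i f_i(A_i x). The choice of least-squares data blocks f_i(z) = 0.5 ||z - b_i||^2, the nonnegativity constraint g, the uniform sampling, the gamma = 0.99 step-size factor, and all dimensions are illustrative assumptions, not the authors' experimental setup.

```python
# Minimal SPDHG sketch (assumed setup): min_x g(x) + sum_i f_i(A_i x)
# with f_i(z) = 0.5 * ||z - b_i||^2 and g = indicator of {x >= 0}.
import numpy as np

rng = np.random.default_rng(0)
n_blocks, d = 4, 50
A = [rng.standard_normal((30, d)) for _ in range(n_blocks)]  # operator blocks A_i
x_true = np.maximum(rng.standard_normal(d), 0)
b = [Ai @ x_true for Ai in A]                                # per-block data b_i

p = np.full(n_blocks, 1.0 / n_blocks)      # uniform sampling probabilities
L = [np.linalg.norm(Ai, 2) for Ai in A]    # block operator norms ||A_i||
gamma = 0.99                               # assumed safety factor < 1
sigma = [gamma * np.sqrt(p[i]) / L[i] for i in range(n_blocks)]
tau = gamma * min(np.sqrt(p[i]) / L[i] for i in range(n_blocks))
# step sizes chosen so that tau * sigma_i * ||A_i||^2 <= gamma^2 * p_i

x = np.zeros(d)
y = [np.zeros(Ai.shape[0]) for Ai in A]     # dual variables y_i
z = sum(Ai.T @ yi for Ai, yi in zip(A, y))  # z = sum_i A_i^T y_i
zbar = z.copy()

for k in range(5000):
    # primal step: prox of tau * g, here a projection onto {x >= 0}
    x = np.maximum(x - tau * zbar, 0.0)
    # sample one dual block and update only that block
    i = rng.choice(n_blocks, p=p)
    v = y[i] + sigma[i] * (A[i] @ x)
    y_new = (v - sigma[i] * b[i]) / (1.0 + sigma[i])  # prox of sigma_i * f_i^*
    delta = A[i].T @ (y_new - y[i])
    y[i] = y_new
    z = z + delta
    zbar = z + delta / p[i]                # extrapolation step with theta = 1

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

The deterministic PDHG would update every dual block at each iteration; SPDHG replaces this with a single randomly sampled block plus the bias-correcting extrapolation zbar, which is what makes each iteration cheap on large-scale problems such as parallel MRI reconstruction.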