A linearly convergent stochastic recursive gradient method for convex optimization

Published: 16 Feb 2020, Last Modified: 15 May 2025 · OpenReview Archive Direct Upload · CC BY 4.0
Abstract: The stochastic recursive gradient algorithm (SARAH) has attracted much interest recently. It admits a simple recursive framework for updating stochastic gradient estimates. Motivated by this, in this paper we propose a new stochastic recursive gradient method, called SARAH-I. Unlike SARAH, SARAH-I incorporates an importance sampling strategy and computes the full gradient at the last iterate of each inner loop. We show that the sequence of distances between the iterates and the optimal set converges linearly under both strong convexity and non-strong convexity conditions. Furthermore, we propose to use the Barzilai–Borwein (BB) approach to adaptively compute step sizes for SARAH-I, and name the resulting method SARAH-I-BB. We establish its convergence and complexity properties in different cases. Finally, numerical tests are reported that indicate the promising performance of the proposed algorithms.
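To make the recursive framework referred to in the abstract concrete, the sketch below illustrates a SARAH-style recursive gradient update combined with a Barzilai–Borwein step size. It is an illustrative reading of the scheme rather than the authors' algorithm: the least-squares objective, uniform sampling (instead of the importance sampling used by SARAH-I), the inner-loop length, and all parameter values are assumptions made for this example.

```python
# Minimal sketch of a SARAH-style recursive gradient method with a BB step size.
# All problem data, sampling, and parameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def full_grad(w):
    # Gradient of the average least-squares loss (1/2n) * ||A w - b||^2.
    return A.T @ (A @ w - b) / n

def stoch_grad(w, i):
    # Gradient of the i-th component loss (1/2) * (a_i^T w - b_i)^2.
    return A[i] * (A[i] @ w - b[i])

w = np.zeros(d)
eta = 0.01           # initial step size; later outer loops use a BB estimate
m = n                # inner-loop length (an assumption for this sketch)
w_prev_outer, g_prev_outer = None, None

for k in range(20):                      # outer iterations
    g = full_grad(w)                     # full gradient at the outer iterate
    if w_prev_outer is not None:
        # BB step size built from consecutive outer iterates and full gradients.
        s, y = w - w_prev_outer, g - g_prev_outer
        eta = (s @ s) / (m * (s @ y))
    w_prev_outer, g_prev_outer = w.copy(), g.copy()

    v, w_prev = g, w.copy()              # recursion starts from the full gradient
    w = w - eta * v
    for _ in range(m):                   # inner recursive gradient updates
        i = rng.integers(n)              # uniform sampling for simplicity
        v = stoch_grad(w, i) - stoch_grad(w_prev, i) + v
        w_prev = w.copy()
        w = w - eta * v

print("final objective:", 0.5 * np.mean((A @ w - b) ** 2))
```

The key design point shown here is the recursive estimate v_t = ∇f_i(w_t) − ∇f_i(w_{t−1}) + v_{t−1}, which reuses the previous estimate instead of recomputing a variance-reduced anchor at every step, while the BB rule replaces a hand-tuned step size with one computed from successive outer iterates.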