Stochastic FISTA Step Search Algorithm for Convex Optimization

Published: 26 Oct 2023 · Last Modified: 13 Dec 2023 · NeurIPS 2023 Workshop Poster
Keywords: convex optimization, gradient methods, accelerated momentum
TL;DR: We propose an accelerated stochastic step search algorithm for convex optimization.
Abstract: In this paper, we propose an accelerated stochastic step search algorithm that combines an accelerated method with a fully adaptive step size parameter for convex problems (Scheinberg et al., 2014) with the stochastic step search analysis of (Paquette and Scheinberg, 2020). Under appropriate conditions on the accuracy of the gradient and function value estimates, our algorithm achieves an expected iteration complexity of $\mathcal{O}(1/\sqrt{\epsilon})$ to reach an $\epsilon$-accurate solution satisfying $f(x) - f_* \leq \epsilon$. This matches the iteration complexity of the deterministic Nesterov accelerated gradient and FISTA algorithms (Nesterov, 1983; Beck and Teboulle, 2009). This paper continues the line of work on stochastic adaptive algorithms studied in (Berahas et al., 2021; Blanchet et al., 2019; Paquette and Scheinberg, 2020) and is the first to develop an accelerated gradient descent type algorithm in this domain.
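The abstract describes combining FISTA-style momentum with an adaptive step size that is adjusted via a step-search test on stochastic function estimates. Below is a minimal, hypothetical sketch of that idea, not the authors' actual algorithm: `grad_est` and `f_est` stand in for stochastic gradient and function value oracles, and the specific sufficient-decrease test, shrink/grow factors, and momentum update are illustrative assumptions.

```python
import numpy as np

def stochastic_fista(grad_est, f_est, x0, alpha0=1.0, shrink=0.5, grow=1.1,
                     max_iter=500):
    """Illustrative FISTA-type iteration with a stochastic step search.

    grad_est(x): stochastic (or exact) estimate of the gradient at x
    f_est(x):    stochastic (or exact) estimate of f(x)
    The step size alpha shrinks when a sufficient-decrease test on the
    (possibly noisy) function estimates fails, and cautiously grows otherwise.
    """
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    alpha = alpha0
    for _ in range(max_iter):
        g = grad_est(y)
        x_new = y - alpha * g  # candidate gradient step from the momentum point
        # Armijo-type step-search test using stochastic estimates.
        if f_est(x_new) <= f_est(y) - 0.5 * alpha * float(np.dot(g, g)):
            alpha *= grow  # successful step: cautiously increase step size
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # Nesterov momentum
            x, t = x_new, t_new
        else:
            alpha *= shrink  # unsuccessful step: shrink step size and retry
    return x
```

With exact oracles this reduces to a deterministic FISTA with backtracking; the stochastic setting analyzed in the paper requires the estimate-accuracy conditions stated in the abstract for the same test to be reliable.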
Submission Number: 84