Accelerating Perturbed Stochastic Iterates in Asynchronous Lock-Free Optimization

Published: 23 Nov 2022, Last Modified: 05 May 2023
OPT 2022 Poster
Keywords: Finite-Sum Convex Optimization, Momentum Method, Asynchronous Lock-Free Optimization, Perturbed Iterate Analysis
TL;DR: We propose a new asynchronous lock-free accelerated SVRG method which achieves the optimal oracle complexity under the perturbed iterate framework (Mania et al., 2017).
Abstract: We show that stochastic acceleration can be achieved under the perturbed iterate framework (Mania et al., 2017) in asynchronous lock-free optimization, which leads to the optimal incremental gradient complexity for finite-sum objectives. We prove that our new accelerated method requires the same linear speed-up condition as existing non-accelerated methods. Our key algorithmic discovery is a new accelerated SVRG variant with sparse updates. Empirical results are presented to verify our theoretical findings.
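To make the abstract's key ingredient concrete, below is a minimal serial sketch of the general accelerated-SVRG template: an SVRG inner loop whose variance-reduced stochastic gradient is combined with a heavy-ball momentum step. This is an illustration only, not the paper's algorithm; the step size `eta`, momentum weight `beta`, the least-squares test problem, and the momentum form are all hypothetical choices, and the sparse updates and asynchronous lock-free execution central to the paper are omitted.

```python
import numpy as np

# Hypothetical finite-sum least-squares problem: f(x) = (1/n) sum_i f_i(x),
# with f_i(x) = 0.5 * (a_i^T x - b_i)^2.
rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def grad_i(x, i):
    """Gradient of the i-th component f_i."""
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    """Full gradient of the average objective."""
    return A.T @ (A @ x - b) / n

L_max = np.max(np.sum(A * A, axis=1))  # max per-component smoothness
eta = 0.1 / L_max                      # hypothetical step size
beta = 0.5                             # hypothetical momentum weight

x = np.zeros(d)
x_prev = x.copy()
for epoch in range(30):
    # SVRG snapshot: full gradient computed once per epoch.
    x_snap = x.copy()
    g_snap = full_grad(x_snap)
    for _ in range(n):
        i = rng.integers(n)
        # Variance-reduced stochastic gradient (the SVRG estimator).
        g = grad_i(x, i) - grad_i(x_snap, i) + g_snap
        # Gradient step plus heavy-ball momentum (illustrative acceleration).
        x_new = x - eta * g + beta * (x - x_prev)
        x_prev, x = x, x_new
    if epoch % 10 == 0:
        obj = 0.5 * np.mean((A @ x - b) ** 2)
        print(f"epoch {epoch:3d}  objective {obj:.6f}")
```

In the paper's setting, each update would touch only the coordinates in the support of the sampled gradient, and many threads would apply such updates without locks; the serial loop above ignores both aspects.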