Online variance-reducing optimization

12 Feb 2018 (modified: 05 May 2023) · ICLR 2018 Workshop Submission · Readers: Everyone
Abstract: We emphasize the importance of variance reduction in stochastic methods and propose a probabilistic interpretation as a way of storing information about past gradients. The resulting algorithm closely resembles the momentum method, except that the weighting of past gradients depends on the distance moved in parameter space rather than on the number of steps taken.
Keywords: online learning, momentum
TL;DR: Online variance reduction looks a lot like momentum
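
To make the contrast concrete, here is a minimal sketch of the two weighting schemes: standard momentum, which decays the gradient buffer by a fixed factor per step, versus a hypothetical distance-based variant in the spirit of the abstract. The exponential decay form and the `scale` parameter are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def momentum_step(grad, buffer, lr=0.01, beta=0.9):
    """Standard momentum: past gradients are down-weighted
    geometrically per *step* (factor beta each iteration)."""
    buffer = beta * buffer + grad
    return -lr * buffer, buffer

def distance_weighted_step(grad, buffer, params, prev_params, lr=0.01, scale=1.0):
    """Hypothetical distance-based variant (illustrative only): the decay
    applied to the gradient buffer depends on how far the parameters have
    moved, not on how many steps were taken."""
    dist = np.linalg.norm(params - prev_params)
    decay = np.exp(-scale * dist)  # small movement -> past gradients kept longer
    buffer = decay * buffer + grad
    return -lr * buffer, buffer
```

With small steps the distance-based decay stays close to 1, so old gradients persist; large moves in parameter space discount them quickly, which is the qualitative behaviour the TL;DR alludes to.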
