Lookahead Optimizer: k steps forward, 1 step back

02 Dec 2019 (modified: 05 May 2023) · NeurIPS 2019 Reproducibility Challenge Blind Report · Readers: Everyone
Abstract: Neural networks have helped achieve state-of-the-art results in various challenging tasks. However, training neural networks is difficult and consumes a great deal of time and memory. The optimizer is an important component of any neural network, and an efficient optimization algorithm can significantly reduce the training time required to reach state-of-the-art results. We replicate the Lookahead optimizer proposed by Zhang et al. and demonstrate its performance on two deep learning tasks, namely classification on CIFAR-10 and CIFAR-100.
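For readers unfamiliar with the method named in the title, the sketch below illustrates the "k steps forward, 1 step back" update rule from Zhang et al.: an inner optimizer takes k fast steps, then the slow weights are moved a fraction alpha toward the resulting fast weights. The toy quadratic loss, plain-SGD inner optimizer, and hyperparameter values are assumptions chosen for illustration only and do not reflect the experimental setup of this report.

```python
# Minimal illustrative sketch of the Lookahead update rule (assumed setup:
# toy quadratic loss, SGD as the inner optimizer, arbitrary hyperparameters).
import numpy as np

def grad(w):
    # Gradient of a toy quadratic loss f(w) = 0.5 * ||w||^2
    return w

def lookahead(w0, k=5, alpha=0.5, inner_lr=0.1, outer_steps=20):
    slow = np.asarray(w0, dtype=float)
    for _ in range(outer_steps):
        fast = slow.copy()
        for _ in range(k):            # k fast steps with the inner optimizer (plain SGD here)
            fast -= inner_lr * grad(fast)
        slow += alpha * (fast - slow) # one slow step: interpolate toward the fast weights
    return slow

print(lookahead(np.array([1.0, -2.0])))  # converges toward the minimum at the origin
```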
Track: Replicability
NeurIPS Paper Id: /forum?id=BklyySSlIS
