A DNN Optimizer that Improves over AdaBelief by Suppression of the Adaptive Stepsize Range

Published: 05 Sept 2023, Last Modified: 05 Sept 2023. Accepted by TMLR.
Abstract: We make contributions towards improving adaptive-optimizer performance. Our improvements are based on suppression of the range of adaptive stepsizes in the AdaBelief optimizer. Firstly, we show that the particular placement of the parameter $\epsilon$ within the update expressions of AdaBelief reduces the range of the adaptive stepsizes, making AdaBelief closer to SGD with momentum. Secondly, we extend AdaBelief by further suppressing the range of the adaptive stepsizes. To achieve this goal, we perform mutual layerwise vector projections between the gradient $\boldsymbol{g}_t$ and its first momentum $\boldsymbol{m}_t$ before using them to estimate the second momentum. The new optimization method is referred to as \emph{Aida}. Thirdly, extensive experimental results show that Aida outperforms nine optimizers when training transformers and LSTMs for NLP, and VGG and ResNet for image classification on CIFAR10 and CIFAR100, while matching the best performance of the nine methods when training WGAN-GP models for image generation. Furthermore, Aida produces higher validation accuracies than AdaBelief when training ResNet18 on ImageNet.
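The abstract's core mechanism, mutual layerwise vector projections between $\boldsymbol{g}_t$ and $\boldsymbol{m}_t$ before estimating the second momentum, can be illustrated with a minimal sketch. This is an assumption-laden reconstruction from the abstract alone, not the paper's exact algorithm: the projection formula, the number of projection rounds `K`, and the AdaBelief-style difference used for the second momentum are all hypothetical choices made here for illustration.

```python
import numpy as np

def project(a, b, eps=1e-12):
    """Project vector a onto the direction of vector b.

    In a layerwise scheme, a and b would be the flattened gradient and
    first-momentum tensors of a single network layer.
    """
    return (np.dot(a, b) / (np.dot(b, b) + eps)) * b

def mutual_projection(g, m, K=1, eps=1e-12):
    """K rounds of mutual projection between gradient g and momentum m.

    Hypothetical sketch: each round replaces g by its projection onto m
    and m by its projection onto g, pulling the two vectors toward a
    common direction and thereby shrinking the (g - m) residual that an
    AdaBelief-style second momentum would accumulate.
    """
    for _ in range(K):
        g_new = project(g, m, eps)
        m_new = project(m, g, eps)
        g, m = g_new, m_new
    return g, m

def second_momentum_update(v, g, m, beta2=0.999, K=1):
    """AdaBelief-style second-momentum update using the projected vectors
    (illustrative; the paper's exact estimator may differ)."""
    g_p, m_p = mutual_projection(g, m, K)
    return beta2 * v + (1.0 - beta2) * (m_p - g_p) ** 2
```

The intuition, as the abstract frames it, is that feeding mutually projected vectors into the second-momentum estimate narrows the spread of the resulting adaptive stepsizes across coordinates, moving the optimizer's behavior closer to SGD with momentum.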
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: Dear Editors and Reviewers, in the camera-ready version we have added the author names and the associated affiliations. A link to the source code has also been added to the abstract.
Code: https://github.com/guoqiang-zhang-x/Aida-Optimizer
Assigned Action Editor: ~Rémi_Flamary1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 1008