ROMUL: Scale Adaptative Population Based Training

28 Sept 2020 (modified: 05 May 2023)
ICLR 2021 Conference Blind Submission
Readers: Everyone
Keywords: hyperparameter search, population based training, differential evolution, hyperparameter optimization, online optimization, deep learning
Abstract: In most pragmatic settings, data augmentation and regularization are essential and require hyperparameter search. Population based training (PBT) is an effective tool for efficiently finding such hyperparameters, as well as schedules over them. In this paper, we compare existing PBT algorithms and contribute a new one: ROMUL, for RObust MULtistep search, which adapts its step size over the course of training. We report competitive results with standard models on CIFAR (image classification) as well as Penn Treebank (language modeling), both of which depend on heavy regularization. We also open-source hoptim, a PBT library agnostic to the training framework, which is simple to use, reentrant, and provides good defaults with ROMUL.
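For readers unfamiliar with the general PBT idea the abstract builds on, here is a minimal, self-contained sketch of an exploit/explore loop in which the perturbation scale is adapted over rounds. This is not the hoptim API or the ROMUL algorithm from the paper; the names (Worker, pbt_step, step_scale), the toy objective, and the shrinking-scale rule are illustrative assumptions only.

```python
# Minimal PBT-style loop with an adaptive perturbation scale (illustrative sketch;
# NOT the paper's ROMUL algorithm or the hoptim library API).
import random
from dataclasses import dataclass


@dataclass
class Worker:
    hparams: dict          # current hyperparameters (e.g. dropout, weight decay)
    score: float = 0.0     # last evaluation score (higher is better)


def evaluate(hparams: dict) -> float:
    # Toy stand-in for "train for a few epochs, then validate".
    target = {"dropout": 0.3, "weight_decay": 1e-4}
    return -sum((hparams[k] - target[k]) ** 2 for k in target)


def pbt_step(population: list[Worker], step_scale: float) -> float:
    """One exploit/explore round; returns an (illustratively) adapted step scale."""
    for w in population:
        w.score = evaluate(w.hparams)
    population.sort(key=lambda w: w.score, reverse=True)
    quarter = max(1, len(population) // 4)
    top, bottom = population[:quarter], population[-quarter:]
    for loser in bottom:
        winner = random.choice(top)
        loser.hparams = dict(winner.hparams)            # exploit: copy a strong worker
        for k in loser.hparams:                          # explore: multiplicative perturbation
            loser.hparams[k] *= random.uniform(1 - step_scale, 1 + step_scale)
    # Illustrative adaptation: slowly shrink the perturbation scale across rounds.
    # The paper's actual multistep rule is different; this only shows where such
    # an adaptation would plug into the loop.
    return step_scale * 0.9


if __name__ == "__main__":
    random.seed(0)
    pop = [Worker({"dropout": random.uniform(0.0, 0.6),
                   "weight_decay": 10 ** random.uniform(-6, -2)}) for _ in range(8)]
    scale = 0.5
    for round_ in range(10):
        scale = pbt_step(pop, scale)
        print(f"round {round_}: best score {max(w.score for w in pop):.4f}, scale {scale:.3f}")
```

In a real setting, evaluate would be a partial training run and the population would execute in parallel; the sketch only illustrates the exploit/explore structure and the place where a scale-adaptive rule such as ROMUL's would act.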
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
One-sentence Summary: We benchmark several PBT algorithms and propose a new one that proves robust across domains.
Supplementary Material: zip
Reviewed Version (pdf): https://openreview.net/references/pdf?id=dHNshm47C