Practical Hyperparameter Optimization
Stefan Falkner, Aaron Klein, Frank Hutter
Feb 12, 2018 (modified: Feb 12, 2018) · ICLR 2018 Workshop Submission
Abstract: Recently, the bandit-based strategy Hyperband (HB) was shown to yield good
hyperparameter settings of deep neural networks faster than vanilla Bayesian
optimization (BO). However, for larger budgets, HB is limited by its random search
component, and BO works better. We propose to combine the benefits of both
approaches to obtain a new practical state-of-the-art hyperparameter optimization
method, which we show to consistently outperform both HB and BO on a range
of problem types, including feed-forward neural networks, Bayesian neural networks,
and deep reinforcement learning. Our method is robust and versatile, while
at the same time being conceptually simple and easy to implement.
TL;DR: We combine Bayesian optimization and Hyperband to obtain a practical hyperparameter optimizer that consistently outperforms both of them.
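The combination the abstract describes can be sketched as Hyperband's successive-halving loop with the random sampling of new configurations replaced by a model-based proposal. The sketch below is a simplified illustration, not the paper's actual method: the nearest-good-point sampler stands in for a proper Bayesian optimization model, and the toy objective and all names are hypothetical.

```python
import random

random.seed(0)

def toy_loss(config, budget):
    # Hypothetical objective: a quadratic bowl with optimum at x = 0.3,
    # observed with noise that shrinks as the budget grows.
    return (config["x"] - 0.3) ** 2 + random.gauss(0, 1.0 / budget)

def sample_config(history, top_frac=0.3):
    # Model-based proposal (a crude stand-in for a BO model): sample near
    # one of the best configurations seen so far; fall back to random
    # search, as plain Hyperband does, until enough observations exist.
    if len(history) < 5:
        return {"x": random.uniform(0.0, 1.0)}
    ranked = sorted(history, key=lambda rec: rec[1])
    good = ranked[: max(1, int(top_frac * len(ranked)))]
    base = random.choice(good)[0]["x"]
    return {"x": min(1.0, max(0.0, random.gauss(base, 0.1)))}

def successive_halving(history, n=9, min_budget=1, eta=3, max_budget=9):
    # One Hyperband bracket: evaluate n configs cheaply, keep the top
    # 1/eta fraction, and re-evaluate the survivors on a larger budget.
    configs = [sample_config(history) for _ in range(n)]
    budget = min_budget
    while True:
        results = [(c, toy_loss(c, budget)) for c in configs]
        history.extend(results)
        results.sort(key=lambda rec: rec[1])
        if budget >= max_budget:
            return results[0]
        configs = [c for c, _ in results[: max(1, len(results) // eta)]]
        budget *= eta

history = []  # shared across brackets, so the sampler keeps improving
best = None
for _ in range(10):
    candidate = successive_halving(history)
    if best is None or candidate[1] < best[1]:
        best = candidate
print(round(best[0]["x"], 2))
```

Because all brackets share one observation history, later brackets spend their budget near previously good regions instead of sampling uniformly, which is the core idea behind combining HB with BO.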