Towards Reproducible Neural Architecture and Hyperparameter Search

Published: 27 Jun 2018, Last Modified: 05 May 2023. ICML 2018 RML Submission.
Abstract: Recent advances in neural architecture and hyperparameter search demand tremendous computational resources, which makes it almost impossible to reproduce experiments. We argue that this hinders progress in the subfield, since new methods cannot be thoroughly compared to existing ones. In this work, we generate a new benchmark for neural architecture search and hyperparameter optimization based on tabular data for a feed-forward neural network. Each function evaluation is a simple table lookup and thus takes only milliseconds, yet mimics the true underlying optimization problem. Furthermore, we analyze the properties of this benchmark and compare a range of state-of-the-art neural architecture and hyperparameter search methods.
Keywords: Benchmark, Hyperparameter Optimization, Neural Architecture Search
TL;DR: We present a new benchmark for hyperparameter optimization and neural architecture search that is cheap to evaluate but mimics the true optimization problem.
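The core idea of the abstract — replacing each expensive training run with a lookup into a table of precomputed results — can be sketched as follows. This is a hypothetical illustration, not the authors' benchmark: the configuration space, hyperparameter names, and error values are all assumptions made for the example.

```python
# Minimal sketch of a tabular benchmark (illustrative only).
# Keys are hyperparameter configurations; values stand in for
# validation errors recorded from real training runs done once offline.
TABLE = {
    ("relu", 16, 0.001): 0.12,
    ("relu", 32, 0.001): 0.10,
    ("tanh", 16, 0.010): 0.18,
    ("tanh", 32, 0.010): 0.15,
}

def evaluate(config):
    """A 'function evaluation' is just a table lookup: it returns the
    precomputed validation error in milliseconds instead of training
    a network from scratch."""
    return TABLE[config]

# A search method queries the benchmark exactly as it would query
# the true (expensive) objective; here, exhaustive search for brevity.
best_config = min(TABLE, key=evaluate)
print(best_config, evaluate(best_config))
```

Because every configuration's result is fixed in the table, repeated runs of any search method are exactly reproducible, which is the property the benchmark is designed to provide.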