Hyperparameter optimization: a spectral approach

15 Feb 2018 (modified: 14 Oct 2024) · ICLR 2018 Conference Blind Submission
Abstract: We give a simple, fast algorithm for hyperparameter optimization inspired by techniques from the analysis of Boolean functions. We focus on the high-dimensional regime, where the canonical example is training a neural network with a large number of hyperparameters. The algorithm, an iterative application of compressed sensing techniques for orthogonal polynomials, requires only uniform sampling of the hyperparameters and is thus easily parallelizable. Experiments for training deep neural networks on CIFAR-10 show that, compared to state-of-the-art tools (e.g., Hyperband and Spearmint), our algorithm finds significantly improved solutions, in some cases better than what is attainable by hand-tuning. In terms of overall running time (i.e., time required to sample various settings of hyperparameters plus additional computation time), we are at least an order of magnitude faster than Hyperband and Bayesian Optimization. We also outperform Random Search by $8\times$. Our method is inspired by provably efficient algorithms for learning decision trees using the discrete Fourier transform. We obtain improved sample-complexity bounds for learning decision trees while matching state-of-the-art bounds on running time (polynomial and quasipolynomial, respectively).
TL;DR: A hyperparameter tuning algorithm using discrete Fourier analysis and compressed sensing
Keywords: Hyperparameter Optimization, Fourier Analysis, Decision Tree, Compressed Sensing
Code: [callowbird/Harmonica](https://github.com/callowbird/Harmonica)
Community Implementations: [3 code implementations](https://www.catalyzex.com/paper/hyperparameter-optimization-a-spectral/code)
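
To make the abstract's core idea concrete, here is a minimal sketch of one stage of the spectral approach: sample binary hyperparameter settings uniformly, evaluate the objective, and run Lasso over the parity (monomial) basis to recover an approximately sparse Fourier representation, whose heaviest monomials point to the most influential hyperparameters. All names (`parity_features`, `harmonica_stage`, the parameter defaults) are illustrative assumptions, not the paper's API; the authors' actual implementation lives in the linked repository.

```python
# A minimal sketch of one Harmonica-style stage, assuming a Boolean objective
# f: {-1, 1}^n -> R (e.g., validation loss as a function of binary
# hyperparameter choices). Function and parameter names here are hypothetical.
import itertools
import numpy as np
from sklearn.linear_model import Lasso

def parity_features(X, degree):
    """Map points in {-1,1}^n to the values of all monomials (parities)
    chi_S(x) = prod_{i in S} x_i with |S| <= degree."""
    n = X.shape[1]
    subsets = [S for d in range(degree + 1)
               for S in itertools.combinations(range(n), d)]
    feats = np.column_stack([np.prod(X[:, list(S)], axis=1) if S
                             else np.ones(len(X))
                             for S in subsets])
    return feats, subsets

def harmonica_stage(f, n_vars, n_samples=300, degree=2, lam=0.01, top_s=5):
    """One stage: uniform sampling plus Lasso over the parity basis recovers
    a sparse Fourier approximation of f; variables appearing in the
    largest-coefficient monomials are flagged as most influential."""
    rng = np.random.default_rng(0)
    X = rng.choice([-1.0, 1.0], size=(n_samples, n_vars))  # uniform samples
    y = np.array([f(x) for x in X])  # evaluations are embarrassingly parallel
    feats, subsets = parity_features(X, degree)
    lasso = Lasso(alpha=lam).fit(feats, y)          # sparse recovery step
    top = np.argsort(-np.abs(lasso.coef_))[:top_s]  # heaviest monomials
    important = sorted({i for k in top for i in subsets[k]})
    return important  # fix these variables, then recurse on the rest

# Toy usage: an objective that truly depends on only a few coordinates.
f = lambda x: x[0] * x[1] - 2.0 * x[3] + 0.1 * np.random.randn()
print(harmonica_stage(f, n_vars=20))  # typically recovers [0, 1, 3]
```

In the full algorithm this stage is applied iteratively: the recovered influential hyperparameters are fixed to their best values and the procedure recurses on the remaining ones, which is what makes the method cheap relative to sequential Bayesian optimization.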