Abstract: Neural architecture search methods are able to find high-performance deep learning architectures with minimal effort from an expert \cite{elsken2018neural}. However, current systems focus on specific use-cases (e.g., convolutional image classifiers and recurrent language models), making them unsuitable for the more general search spaces an expert might wish to write. Hyperparameter optimization systems \cite{bergstra2013hyperopt,snoek2012practical,falkner2018bohb} are more general-purpose, but lack the constructs needed for easy application to architecture search. In this work, we propose a formal language for encoding search spaces over arbitrary computational graphs. The language constructs allow us to write modular, composable, and reusable search space encodings and to reason about search space design. We use our language to encode search spaces from the architecture search literature. The language allows us to decouple the implementations of the search space and the search algorithm. Our experiments demonstrate the ease with which we can expose search algorithms to search spaces through a consistent interface and experiment with different combinations of search space and search algorithm without implementing each combination from scratch. We release an implementation of our language with this paper.
Code Link: The code used to run the experiments reported in the paper can be found at https://github.com/negrinho/negrinho2019towards. The framework will continue to be maintained at https://github.com/negrinho/deep_architect, so we recommend building on that implementation instead.
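To illustrate the decoupling the abstract describes, here is a minimal, hypothetical sketch (this is not the DeepArchitect API; all names are illustrative): a search space is a collection of unresolved choices attached to a graph template, and a search algorithm interacts with the space only through that generic choice interface, so spaces and searchers compose freely.

```python
import random

class Choice:
    """An unresolved hyperparameter: a named set of candidate values."""
    def __init__(self, name, options):
        self.name = name
        self.options = options

def conv_block(index):
    """A reusable, composable sub-space encoding one conv layer's choices."""
    return [
        Choice("filters_%d" % index, [32, 64, 128]),
        Choice("kernel_%d" % index, [3, 5]),
    ]

def make_space(num_layers):
    """Compose block sub-spaces into a search space for the whole graph."""
    space = []
    for i in range(num_layers):
        space.extend(conv_block(i))
    return space

def random_search(space, rng):
    """A searcher that knows nothing about the space's internal structure:
    it resolves each choice independently through the shared interface."""
    return {c.name: rng.choice(c.options) for c in space}

rng = random.Random(0)
space = make_space(num_layers=2)
architecture = random_search(space, rng)  # e.g. {'filters_0': 64, ...}
```

Because `random_search` only touches the `Choice` interface, swapping in a different searcher (e.g., evolutionary or model-based) requires no change to the space encoding, which is the kind of reuse the language is designed to provide.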
CMT Num: 7624