Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization

Published: 14 Jul 2021, Last Modified: 22 Oct 2023
Venue: AutoML@ICML2021 Poster
Keywords: AutoML, Hyperparameter Optimization, Neural Architecture Search, Multi-Objective Optimization
TL;DR: We propose a set of methods that extend current approaches to jointly optimize neural architectures and hyperparameters with respect to multiple objectives.
Abstract: While both neural architecture search (NAS) and hyperparameter optimization (HPO) have been studied extensively in recent years, NAS methods typically assume fixed hyperparameters and vice versa. Furthermore, NAS has recently often been framed as a multi-objective optimization problem, in order to take, e.g., resource requirements into account. In this paper, we propose a set of methods that extend current approaches to jointly optimize neural architectures and hyperparameters with respect to multiple objectives. We hope that these methods will serve as simple baselines for future research on multi-objective joint NAS + HPO.
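To make the setting concrete, the following is a minimal, self-contained sketch of what a simple baseline for multi-objective joint NAS + HPO might look like: a single configuration space covers both architectural choices and training hyperparameters, each sampled configuration is evaluated on several objectives, and the non-dominated (Pareto-optimal) configurations are returned. All names, ranges, and the synthetic evaluation below are assumptions made for illustration; the paper's actual methods and search spaces differ.

```python
import math
import random

# Illustrative joint NAS + HPO search space mixing architectural choices
# (depth, width, cell operation) with training hyperparameters (learning
# rate, weight decay). Names and ranges are assumptions for this sketch.
SEARCH_SPACE = {
    "num_layers":    lambda: random.randint(4, 20),
    "num_channels":  lambda: random.choice([16, 32, 64]),
    "operation":     lambda: random.choice(["conv3x3", "conv5x5", "skip"]),
    "learning_rate": lambda: 10 ** random.uniform(-4, -1),
    "weight_decay":  lambda: 10 ** random.uniform(-5, -2),
}

def sample_config():
    """Draw one joint architecture + hyperparameter configuration."""
    return {name: draw() for name, draw in SEARCH_SPACE.items()}

def evaluate(config):
    """Stand-in for training: returns objectives to minimize, here
    (validation error, parameter count). A real run would train the
    network defined by `config`; this synthetic proxy only exists so
    the sketch executes end to end."""
    size = config["num_layers"] * config["num_channels"] ** 2
    error = (1.0 / (1.0 + 1e-4 * size)
             + 0.05 * abs(math.log10(config["learning_rate"]) + 2.5)
             + random.gauss(0.0, 0.01))
    return (error, float(size))

def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b` (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(evaluated):
    """Keep only the non-dominated (config, objectives) pairs."""
    return [(cfg, obj) for cfg, obj in evaluated
            if not any(dominates(other, obj) for _, other in evaluated)]

if __name__ == "__main__":
    # Random-search baseline: sample, evaluate, and report the Pareto set.
    evaluated = [(sample_config(), None) for _ in range(100)]
    evaluated = [(cfg, evaluate(cfg)) for cfg, _ in evaluated]
    for cfg, (err, size) in pareto_front(evaluated):
        print(f"error={err:.3f}  params={size:.0f}  layers={cfg['num_layers']}")
```

Even this random-search baseline illustrates the core design decision the paper addresses: architecture and hyperparameter choices live in one space and are traded off jointly against multiple objectives, rather than being optimized in separate stages.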
Ethics Statement: Our proposed methods automate machine learning workflows and may therefore carry a risk of job losses, as is the case for any automation process. There is also a risk that our methods might be used, e.g., for military purposes. On the other hand, automation leaves practitioners and researchers more time for other fulfilling parts of their jobs, and may thus lead to more productivity and innovation. Furthermore, we hope that our methodological contributions will enable a wide range of non-experts to use deep learning, rather than limiting its application to the smaller group of companies with the resources to hire deep learning experts.
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2105.01015/code)