Mutation is all you need

Published: 14 Jul 2021, Last Modified: 22 Oct 2023, AutoML@ICML2021 Poster
Keywords: Bayesian Optimization, Neural Architecture Search, BANANAS, Benchmark
TL;DR: All that BANANAS needs for good performance on NAS-Bench-301 is mutation as its acquisition function optimizer.
Abstract: Neural architecture search (NAS) promises to make deep learning accessible to non-experts by automating the architecture engineering of deep neural networks. BANANAS is one state-of-the-art NAS method that is embedded within the Bayesian optimization framework. Recent experimental findings have demonstrated that the strong performance of BANANAS on the NAS-Bench-101 benchmark is determined by its path encoding rather than by its choice of surrogate model. We present experimental results suggesting that the performance of BANANAS on the NAS-Bench-301 benchmark is determined by its acquisition function optimizer, which minimally mutates the incumbent.
Ethics Statement: There are no ethical concerns to consider.
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2107.07343/code)