EZNAS: Evolving Zero-Cost Proxies For Neural Architecture Scoring

Published: 31 Oct 2022, Last Modified: 21 Dec 2022 (NeurIPS 2022 Accept)
Keywords: Zero Shot Neural Architecture Search, Neural Architecture Search, Program Synthesis, Evolutionary Search, Genetic Programming
TL;DR: We use genetic programming to discover zero-cost neural architecture scoring metrics that outperform all existing metrics.
Abstract: Neural Architecture Search (NAS) has significantly improved productivity in the design and deployment of neural networks (NNs). As NAS typically evaluates multiple models by training them partially or completely, the improved productivity comes at the cost of a significant carbon footprint. To alleviate this expensive training routine, zero-shot/zero-cost proxies analyze an NN at initialization to generate a score that correlates strongly with its true accuracy. Zero-cost proxies are currently designed by experts conducting multiple cycles of empirical testing over candidate algorithms, datasets, and neural architecture design spaces. This experimentation lowers productivity and is an unsustainable approach to zero-cost proxy design as deep learning use cases diversify. Additionally, existing zero-cost proxies fail to generalize across neural architecture design spaces. In this paper, we propose a genetic programming framework to automate the discovery of zero-cost proxies for neural architecture scoring. Our methodology efficiently discovers an interpretable and generalizable zero-cost proxy that achieves state-of-the-art score-accuracy correlation on all datasets and search spaces of NASBench-201 and Network Design Spaces (NDS). We believe that this research indicates a promising direction towards automatically discovering zero-cost proxies that can work across network architecture design spaces, datasets, and tasks.
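
To make the approach concrete, below is a minimal sketch (not the authors' released code) of the core idea: represent a zero-cost proxy as a small program over per-layer statistics of an untrained network, score candidate proxies by their Spearman rank correlation with known accuracies, and evolve them by mutation. The primitive-op set, the toy MLP "search space", and the placeholder accuracies are all hypothetical stand-ins, not NASBench-201 data or the paper's actual DSL.

```python
# Illustrative sketch of evolving a zero-cost proxy; all specifics here
# (ops, search space, accuracies) are hypothetical placeholders.
import random
import numpy as np
import torch
import torch.nn as nn

# --- primitive ops a candidate proxy program can apply per layer ---
UNARY_OPS = {
    "abs":   torch.abs,
    "log1p": lambda t: torch.log1p(torch.abs(t)),
    "relu":  torch.relu,
}
REDUCERS = {
    "mean": torch.mean,
    "std":  torch.std,
    "l1":   lambda t: t.abs().sum(),
}

def random_proxy():
    """Sample a tiny program: one unary op + one reduction, summed over layers."""
    return (random.choice(list(UNARY_OPS)), random.choice(list(REDUCERS)))

def mutate(proxy):
    """Randomly replace one component of the program."""
    op, red = proxy
    if random.random() < 0.5:
        op = random.choice(list(UNARY_OPS))
    else:
        red = random.choice(list(REDUCERS))
    return (op, red)

def score_network(net, proxy, x, y):
    """Apply the proxy to gradients of an *untrained* net on one minibatch."""
    op, red = proxy
    net.zero_grad()
    loss = nn.functional.cross_entropy(net(x), y)
    loss.backward()
    total = 0.0
    for p in net.parameters():
        if p.grad is not None:
            total += REDUCERS[red](UNARY_OPS[op](p.grad)).item()
    return total

def spearman(a, b):
    """Spearman rank correlation via Pearson correlation of ranks."""
    ra, rb = np.argsort(np.argsort(a)), np.argsort(np.argsort(b))
    return float(np.corrcoef(ra, rb)[0, 1])

# --- toy "search space": MLPs of varying width with fake accuracies ---
torch.manual_seed(0); random.seed(0)
widths = [8, 16, 32, 64, 128]
nets = [nn.Sequential(nn.Linear(20, w), nn.ReLU(), nn.Linear(w, 4)) for w in widths]
fake_acc = [0.52, 0.58, 0.63, 0.66, 0.68]   # placeholder ground truth, not real data
x, y = torch.randn(32, 20), torch.randint(0, 4, (32,))

# --- simple (1+1) evolutionary loop: keep the best-correlating proxy ---
best = random_proxy()
best_fit = spearman([score_network(n, best, x, y) for n in nets], fake_acc)
for _ in range(30):
    child = mutate(best)
    fit = spearman([score_network(n, child, x, y) for n in nets], fake_acc)
    if fit > best_fit:
        best, best_fit = child, fit
print("best proxy:", best, "spearman:", round(best_fit, 3))
```

In the paper, the evolved programs operate over richer network statistics and are evaluated on real NAS benchmarks; this sketch scales everything down so the end-to-end evolve-score-correlate loop stays runnable in a few seconds.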