Graph is All You Need? Lightweight Data-agnostic Neural Architecture Search without Training

Published: 12 Jul 2024 · Last Modified: 09 Aug 2024 · AutoML 2024 Workshop · CC BY 4.0
Keywords: Neural Architecture Search, Graph Theory, Network
Abstract: Neural architecture search (NAS) automates the design of neural architectures. However, training the candidates generated by the search algorithm for performance evaluation incurs considerable computational overhead. We propose a method, dubbed NASGraph, that substantially reduces this cost by converting neural architectures to graphs and searching in the graph space. We empirically find that a simple graph measure, the average degree, is powerful enough to serve as the NAS proxy in lieu of the evaluation metric. Our proposed method is training-free, data-agnostic, and lightweight. Moreover, it achieves competitive performance on various NAS benchmarks, including NASBench-101, NASBench-201, and NDS. We also demonstrate that NASGraph generalizes to the more challenging tasks in Micro TransNAS-Bench-101.
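To make the proxy concrete, below is a minimal Python sketch of scoring a candidate architecture's graph by its average degree. This is illustrative only, not NASGraph's actual implementation: the paper's contribution includes the conversion from a neural architecture to a graph, which is not reproduced here, and the helper `average_degree_score` and the toy `cell` DAG are hypothetical names for this example.

```python
import networkx as nx

def average_degree_score(graph: nx.DiGraph) -> float:
    """Training-free proxy: the average degree of the architecture's graph.

    Assumption (per the abstract): candidates with higher average degree are
    ranked as stronger architectures, replacing trained-accuracy evaluation.
    """
    n = graph.number_of_nodes()
    if n == 0:
        return 0.0
    # Average degree = 2 * |E| / |V|: each edge contributes one in-degree
    # and one out-degree, so degrees sum to twice the edge count.
    return 2.0 * graph.number_of_edges() / n

# Hypothetical candidate: a small cell-like DAG with 4 nodes and 5 edges.
cell = nx.DiGraph([(0, 1), (0, 2), (1, 3), (2, 3), (0, 3)])
print(average_degree_score(cell))  # 2.5 -- rank candidates by this value
```

In a search loop, one would convert each candidate to its graph form, compute this score, and keep the top-ranked architectures, with no training or input data required.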
Submission Checklist: Yes
Broader Impact Statement: Yes
Paper Availability And License: Yes
Code Of Conduct: Yes
Submission Number: 15