Graph is All You Need? Lightweight Data-agnostic Neural Architecture Search without Training

22 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Neural Architecture Search, Network Science, Computer Vision
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Neural architecture search (NAS) enables the automatic design of neural network models. However, training the candidates generated by the search algorithm for performance evaluation incurs considerable computational overhead. Our method, dubbed NASGraph, substantially reduces this cost by converting neural architectures to graphs and using properties of the converted graphs as proxy scores in lieu of validation accuracy. Our training-free NAS method is data-agnostic and lightweight. It can find the best architecture among 200 architectures randomly sampled from NAS-Bench-201 in 217 CPU seconds. We achieve state-of-the-art performance on 7 out of 9 datasets in the NAS-Bench-101, NAS-Bench-201, and NDS search spaces. We also demonstrate that NASGraph generalizes to more challenging tasks on Micro TransNAS-Bench-101.
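The abstract's core idea can be sketched as follows. This is a minimal illustration, not the paper's actual NASGraph algorithm: it assumes each candidate cell is already represented as a small DAG (edge list) and uses a simple graph property, average degree, as a stand-in for whatever graph metrics the paper computes. The candidate cells and the choice of proxy are hypothetical.

```python
# Hedged sketch of training-free, graph-based architecture ranking:
# score each candidate by a graph property instead of trained accuracy.

def average_degree(edges, num_nodes):
    """Average (in + out) degree of a DAG given as an edge list."""
    return 2 * len(edges) / num_nodes

# Two hypothetical candidate cells (node 0 = input, last node = output).
cell_a = {"num_nodes": 4, "edges": [(0, 1), (0, 2), (1, 3), (2, 3)]}
cell_b = {"num_nodes": 4, "edges": [(0, 1), (1, 2), (2, 3)]}
candidates = {"cell_a": cell_a, "cell_b": cell_b}

# Proxy score replaces validation accuracy: no training, no data needed.
scores = {name: average_degree(c["edges"], c["num_nodes"])
          for name, c in candidates.items()}

best = max(scores, key=scores.get)
print(best, scores)  # the denser cell wins under this proxy
```

Ranking 200 sampled architectures this way only requires one cheap graph computation per candidate, which is why such proxies run in CPU seconds rather than GPU hours.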
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: pdf
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6348