A Deeper Look at Zero-Cost Proxies for Lightweight NAS

Anonymous

Published: 28 Mar 2022, Last Modified: 05 May 2023 · BT@ICLR2022 · Readers: Everyone
Keywords: Neural Architecture Search, NAS, AutoML, Zero-Cost Proxies
Abstract: While early algorithms for neural architecture search (NAS) used extreme computational budgets, recent techniques have aimed to lower the computation time. Very recently, a new family of techniques was introduced that approximates the performance of neural architectures in just five seconds: so-called "zero-cost proxies" (Mellor et al. 2020; Abdelfattah et al. 2021). However, two recent papers have shown that simple baselines such as "number of parameters" are competitive, casting doubt on the efficacy of zero-cost proxies (Ning et al. 2021; Chen et al. 2021). In this blog post, we take a deeper look at zero-cost proxies for NAS. We survey prior work and then run new experiments using the recent NAS-Bench-360 and TransNAS-Bench-101 benchmarks, which provide a much more diverse set of datasets and tasks than all prior work. We find that there is no clear "best" zero-cost proxy across all tasks, and we confirm that simple baselines are consistently competitive with proposed zero-cost proxies. However, we also conclude that zero-cost proxies have great potential in NAS, especially when used in combination with other methods or to improve the performance of existing methods at very little additional cost. Overall, we provide a landscape overview of this promising area, highlight strengths and weaknesses, and shed light on future research in this direction.
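To make the "simple baseline" claim from the abstract concrete, here is a minimal sketch of the "number of parameters" proxy used to rank candidate architectures. The architectures below are hypothetical toy examples (represented only as lists of weight-tensor shapes, not real networks), and the function name `num_params` is our own; actual zero-cost proxy pipelines such as Abdelfattah et al. (2021) score real, instantiated models.

```python
from math import prod

def num_params(layer_shapes):
    """Score an architecture by its total parameter count.

    `layer_shapes` is a list of weight-tensor shapes, e.g. (out, in)
    for a linear layer plus (out,) for its bias.
    """
    return sum(prod(shape) for shape in layer_shapes)

# Two hypothetical candidate architectures (weight shapes + biases).
arch_a = [(64, 3 * 3 * 3), (64,), (10, 64), (10,)]
arch_b = [(128, 3 * 3 * 3), (128,), (10, 128), (10,)]

# In NAS, candidates would be ranked by estimated accuracy; the proxy
# simply substitutes parameter count as the ranking score.
ranked = sorted([("a", arch_a), ("b", arch_b)],
                key=lambda kv: num_params(kv[1]), reverse=True)
print([name for name, _ in ranked])  # prints ['b', 'a']
```

The point of such a baseline is that it requires no training and essentially no computation, which is the bar any proposed zero-cost proxy must beat.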
ICLR Paper: https://arxiv.org/abs/2101.08134