Receptive Field Reliant Zero-Cost Proxies for Neural Architecture Search

Published: 01 Jan 2023, Last Modified: 25 Apr 2024 · ICASSP 2023 · CC BY-SA 4.0
Abstract: Neural Architecture Search (NAS) is a fast-growing technology for the automatic design of deep-learning architectures. NAS comprises three stages: search-space design, search strategy, and evaluation criterion. Among these, evaluating candidate architectures is a highly cost-intensive task. In this work, we propose a set of receptive-field-reliant zero-cost proxies that need only one iteration of training and thereby reduce the computational time of the evaluation criterion during NAS. The proposed zero-cost proxies bind each layer's prune-at-initialization score to its receptive field, yielding a more effective and generalizable measure than the vanilla counterparts. The proposed zero-cost proxies are validated on a set of PyTorchCV models and on the NAS-Bench-201 benchmark. They perform better than the vanilla counterparts on the PyTorchCV models and competitively on NAS-Bench-201. The efficiency of the proposed method is also demonstrated in NAS on NAS-Bench-201 using Aging Evolution as the controller.
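To illustrate the idea of layer-wise binding, below is a minimal sketch of one plausible instance: a SNIP-style prune-at-initialization saliency computed per convolutional layer from a single forward/backward pass, then scaled by that layer's analytically computed receptive field. This is not the authors' exact formulation; the linear weighting by receptive-field size, the function names (`layer_receptive_fields`, `rf_weighted_snip`), and the restriction to a sequential conv stack are all assumptions made for illustration.

```python
# Hypothetical sketch of a receptive-field-weighted zero-cost proxy (SNIP-style).
import torch
import torch.nn as nn

def layer_receptive_fields(model):
    """Analytic receptive-field size per conv layer for a sequential stack.

    Visits layers in definition order and applies the standard recurrence:
    rf += (kernel - 1) * jump, then jump *= stride.
    """
    rf, jump = 1, 1
    fields = {}
    for name, m in model.named_modules():
        if isinstance(m, nn.Conv2d):
            rf += (m.kernel_size[0] - 1) * jump
            jump *= m.stride[0]
            fields[name] = rf
    return fields

def rf_weighted_snip(model, inputs, targets, criterion=nn.CrossEntropyLoss()):
    """One training iteration's worth of gradients; per-layer SNIP saliency
    |g * w| is scaled by the layer's receptive field (assumed linear binding)."""
    model.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    fields = layer_receptive_fields(model)
    score = 0.0
    for name, m in model.named_modules():
        if isinstance(m, nn.Conv2d) and m.weight.grad is not None:
            saliency = (m.weight.grad * m.weight).abs().sum().item()
            score += fields[name] * saliency  # layer-wise binding with RF
    return score

# Usage on a toy network and random batch (for illustration only):
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2), nn.ReLU(),
    nn.Flatten(), nn.Linear(32 * 14 * 14, 10))
x = torch.randn(8, 3, 32, 32)
y = torch.randint(0, 10, (8,))
print(rf_weighted_snip(model, x, y))
```

Because the score needs only one forward/backward pass at initialization, it can rank thousands of candidate architectures without training, which is the property a NAS controller such as Aging Evolution exploits.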