Feature Activation-Driven Zero-Shot NAS: A Contrastive Learning Framework

Published: 01 Jan 2024 · Last Modified: 13 Nov 2024 · ICANN (1) 2024 · CC BY-SA 4.0
Abstract: The main bottleneck in current neural architecture search (NAS) algorithms is the inability to evaluate candidate neural architectures efficiently. While existing performance-predictor-based approaches have significantly reduced the time required to assess candidate architectures, they still require training a substantial number of architectures, which consumes considerable computational resources. To address this challenge, we introduce a novel training-free evaluation metric rooted in the principles of contrastive learning. The metric evaluates an architecture's performance by analyzing the differential responses that positive and negative samples elicit within the deep architecture, offering a more resource-efficient way to assess neural architectures. We demonstrate the effectiveness of our method on evaluation benchmarks such as NAS-Bench-101 and NAS-Bench-201. Combined with a search strategy, our method achieves an accuracy of 97.5% on CIFAR-10. The code for our work is available at https://github.com/wdi-nancy/FADZS-NAS.
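The abstract does not spell out the metric's exact form, so the following is only a minimal sketch of the general idea in PyTorch: score an untrained candidate network by the gap between its feature similarity on positive pairs (two views of the same image) and negative pairs (different images). Every name here (TinyConvNet, augment, contrastive_score, the similarity-gap formula) is an illustrative assumption, not the paper's implementation; see the linked repository for the actual method.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy backbone standing in for a candidate architecture.
# (Assumption: the paper scores real NAS-Bench cells; this is a placeholder.)
class TinyConvNet(nn.Module):
    def __init__(self, channels=16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )

    def forward(self, x):
        return self.body(x)

def augment(x):
    # Cheap "positive view": horizontal flip plus small Gaussian noise.
    # (Assumption: the paper may use different augmentations.)
    return torch.flip(x, dims=[3]) + 0.05 * torch.randn_like(x)

@torch.no_grad()
def contrastive_score(net, batch):
    """Training-free proxy: mean cosine similarity of positive pairs
    minus mean cosine similarity of negative pairs, measured on the
    UNTRAINED network's features. A larger gap suggests the architecture
    already pulls views of the same input together and pushes different
    inputs apart at initialization."""
    z1 = F.normalize(net(batch), dim=1)           # features of original views
    z2 = F.normalize(net(augment(batch)), dim=1)  # features of positive views
    pos = (z1 * z2).sum(dim=1).mean()             # matched (positive) pairs
    sim = z1 @ z1.t()                             # all-pairs similarity matrix
    n = sim.size(0)
    neg = sim[~torch.eye(n, dtype=torch.bool)].mean()  # off-diagonal = negatives
    return (pos - neg).item()

if __name__ == "__main__":
    batch = torch.randn(32, 3, 32, 32)  # stand-in for a CIFAR-10 batch
    candidates = [TinyConvNet(8), TinyConvNet(32)]
    scores = [contrastive_score(net, batch) for net in candidates]
    print(scores)  # rank candidate architectures without any training
```

Because no gradients or parameter updates are involved, scoring a candidate costs only one forward pass per view, which is what makes this style of proxy attractive inside a NAS search loop.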