Fast Evolutionary Neural Architecture Search by Contrastive Predictor with Linear Regions

Published: 01 Jan 2023, Last Modified: 25 Feb 2024, GECCO 2023
Abstract: Evolutionary neural architecture search (ENAS) has emerged as a promising approach to finding high-performance neural architectures. However, its widespread application has been limited by expensive computational costs inherent to evolutionary algorithms. In this study, we aim to significantly reduce the computational costs of ENAS by incorporating a training-free performance metric, with which network performance can be estimated in only a single forward pass. Training-free metrics, however, have their own challenges, in particular an insufficient correlation with ground-truth performance. We adopt a Graph Convolutional Network (GCN) based contrastive predictor that retains the low cost of the training-free metric while improving the correlation between the estimated and true performance of candidate architectures. Combining a training-free metric (the number of linear regions) with the GCN-based contrastive predictor and an active learning scheme, we propose Fast-ENAS, which achieves superior search efficiency and performance on the NAS-Bench-201 and DARTS benchmark search spaces. Furthermore, searching on the DARTS space with a single GPU, Fast-ENAS requires only 0.02 GPU days (29 minutes) and 0.026 GPU days (37 minutes) to achieve test error rates of 2.50% and 24.30% on CIFAR-10 and ImageNet, respectively.
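The training-free metric named in the abstract can be illustrated with a small sketch. A ReLU network computes a piecewise-linear function, so each distinct binary ReLU activation pattern observed over a minibatch corresponds to one linear region sampled by the batch; counting the unique patterns after a single forward pass gives a cheap performance proxy. The sketch below is a minimal NumPy illustration of this idea, not the paper's implementation; the network sizes, function names, and batch size are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    # Piecewise-linear activation: the source of the linear regions.
    return np.maximum(z, 0.0)

def forward_patterns(x, weights):
    """Single forward pass; record the binary ReLU pattern of every layer.

    Illustrative helper, not from the paper: `weights` is a list of
    (W, b) pairs for a plain fully-connected ReLU network.
    """
    pattern_parts = []
    h = x
    for W, b in weights:
        pre = h @ W + b
        pattern_parts.append(pre > 0)  # which units fire for each sample
        h = relu(pre)
    # One boolean row per sample: its full activation pattern.
    return np.concatenate(pattern_parts, axis=1)

def count_linear_regions(x, weights):
    """Distinct activation patterns over the batch ~ number of linear
    regions the (untrained) network carves the sampled inputs into."""
    patterns = forward_patterns(x, weights)
    return np.unique(patterns, axis=0).shape[0]

# Random untrained network, 16 -> 32 -> 32 (sizes chosen for illustration).
weights = [
    (rng.standard_normal((16, 32)), rng.standard_normal(32)),
    (rng.standard_normal((32, 32)), rng.standard_normal(32)),
]
batch = rng.standard_normal((64, 16))
print(count_linear_regions(batch, weights))
```

The count is bounded above by the batch size, so architectures are compared by how many distinct regions they induce on the same sampled batch; this is the quantity the contrastive predictor then takes as its cheap input signal.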