Abstract: Neural architecture search (NAS) can automatically design well-performing architectures of deep neural networks (DNNs) and has been widely investigated for real-world applications. However, NAS is computationally expensive because a vast number of DNNs must be trained to obtain their performance during the search process. Performance predictors can estimate the performance of a DNN without training it, and thus have great potential to overcome this barrier. However, existing performance predictors are typically offline and are trained on only a small set of labeled architectures. As a result, their predictive ability is limited because they ignore the promising DNN architectures generated during the NAS process. In this article, we propose a Gaussian process-based online performance predictor (GPOPP) tailored to evolutionary NAS. To achieve this, GPOPP selects suitable architectures for updating itself during the search process, using expected improvement as the acquisition function. Further, because architectures cannot be used directly as training data, GPOPP includes a binary encoding scheme that converts architectures into a suitable format. Although the original intention of designing GPOPP was to provide an alternative way to build promising performance predictors, even at some cost in performance, the experiments show that GPOPP can outperform most state-of-the-art methods. For example, GPOPP-assisted NAS achieves 76.1% accuracy on ImageNet with only 1.1 graphics processing unit (GPU) days. In addition, ablation studies demonstrate the effectiveness of the components designed in GPOPP. The source code is available at https://github.com/songxt3/GPOPP.
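The core loop the abstract describes can be illustrated with a minimal sketch (not the authors' code): architectures are binary-encoded, a Gaussian process surrogate predicts their accuracy, and expected improvement (EI) selects which candidate to actually train and use for an online update of the predictor. The encoding, toy accuracy function, and all parameter values below are illustrative assumptions.

```python
# Minimal sketch of GP surrogate + EI selection, assuming a hypothetical
# fixed-length binary encoding of architectures. Not the GPOPP implementation.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
DIM = 12  # assumed encoding length (e.g., flattened op/connection choices)

# Labeled pool: binary-encoded architectures with a toy "validation accuracy".
X = rng.integers(0, 2, size=(20, DIM)).astype(float)
y = X.mean(axis=1) + 0.05 * rng.standard_normal(20)

# GP surrogate fitted on the labeled architectures.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=2.0),
                              alpha=1e-3, normalize_y=True).fit(X, y)

def expected_improvement(cands, best, xi=0.01):
    """EI for maximizing accuracy: favors candidates whose predicted
    mean and uncertainty suggest improvement over the current best."""
    mu, sigma = gp.predict(cands, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - best - xi) / sigma
    return (mu - best - xi) * norm.cdf(z) + sigma * norm.pdf(z)

# Candidate architectures produced by the evolutionary search (simulated here).
candidates = rng.integers(0, 2, size=(50, DIM)).astype(float)
ei = expected_improvement(candidates, y.max())
pick = candidates[int(np.argmax(ei))]  # architecture to train next, whose
                                       # true accuracy then updates the GP
print(pick.shape, len(ei))
```

In a real run, `pick` would be trained to obtain its true accuracy, which is then appended to `(X, y)` and the GP refitted, keeping the predictor current with the architectures the search actually visits.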
External IDs:dblp:journals/tsmc/SongJZLGS25