Evolutionary Neural Architecture Search with Performance Predictor Based on Hybrid Encodings

15 Aug 2024 (modified: 21 Aug 2024) · IEEE ICIST 2024 Conference Submission · CC BY 4.0
TL;DR: ENAS with Performance Predictor
Abstract: Neural architecture search (NAS) has received considerable attention as deep neural networks (DNNs) have advanced across scientific and application fields. By learning the relationship between neural network architectures and their corresponding performance, performance predictors play a critical role in improving the efficiency of NAS methods. However, the effectiveness of a performance predictor depends mainly on how the predictor is trained and on how neural network architectures are encoded. In this paper, we propose a hybrid encoding-based predictor built upon two computation-aware encodings with different training approaches: a generative module trained by unsupervised learning to better encode architectures, and a graph flow module trained by supervised learning to reduce the number of architectures that must be evaluated. Both are beneficial for searching for the optimal architecture representation in the latent space. Additionally, we propose an evolutionary neural architecture search method (HEP-ENAS) that efficiently explores promising architectures by applying the hybrid encoding-based performance predictor within the covariance matrix adaptation evolution strategy (CMA-ES). A series of experiments conducted on NAS benchmarks demonstrates the benefits of the hybrid encoding-based predictor for searching for the optimal architecture in the latent space and the effectiveness of HEP-ENAS compared with popular NAS methods.
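To make the search loop concrete, below is a minimal, hypothetical sketch (not the authors' code) of how a learned performance predictor can serve as the fitness function for CMA-ES over a continuous latent architecture space, using the pycma library. The latent dimensionality, the toy surrogate `predicted_accuracy`, and the decoding step back to a discrete architecture are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
import cma  # pycma: standard CMA-ES implementation

LATENT_DIM = 16  # assumed size of the architecture latent space
rng = np.random.default_rng(0)

# Stand-in for the hybrid encoding-based predictor: maps a latent vector to a
# predicted validation accuracy. In the paper this role is played by the
# combination of a generative (unsupervised) encoder and a graph flow
# (supervised) regressor; here it is a toy function for illustration only.
W = rng.standard_normal(LATENT_DIM)
def predicted_accuracy(z: np.ndarray) -> float:
    return float(1.0 / (1.0 + np.exp(-W @ z)))  # toy surrogate in [0, 1]

# CMA-ES minimizes, so the objective is the negative predicted accuracy.
es = cma.CMAEvolutionStrategy(np.zeros(LATENT_DIM), 0.5,
                              {"maxiter": 50, "verbose": -9})
while not es.stop():
    candidates = es.ask()  # sample latent vectors from the current distribution
    es.tell(candidates,
            [-predicted_accuracy(np.asarray(z)) for z in candidates])

best_z = es.result.xbest  # latent code to decode into a concrete architecture
print("best predicted accuracy:", predicted_accuracy(best_z))
```

In this setup, only the surrogate is queried during the evolutionary search; actual training of candidate architectures would be reserved for a small set of top-ranked latent codes after decoding.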
Submission Number: 170