Accelerating Evolutionary Neural Architecture Search via Multifidelity Evaluation

Published: 01 Jan 2022, Last Modified: 12 May 2023, IEEE Trans. Cogn. Dev. Syst. 2022
Abstract: Evolutionary neural architecture search (ENAS) has recently received increasing attention for its effectiveness in finding high-quality neural architectures; however, it incurs a high computational cost because individual evaluation requires training the architecture encoded by each individual for a complete number of epochs. Numerous ENAS approaches have been developed to reduce the evaluation cost, yet most of them struggle to retain high evaluation accuracy. To address this issue, this article proposes an accelerated ENAS via multifidelity evaluation, termed MFENAS, which substantially reduces the individual evaluation cost by training the architecture encoded by each individual for only a small number of epochs. The balance between evaluation cost and evaluation accuracy is maintained by the proposed multifidelity evaluation, which integrates multiple evaluations under different numbers of training epochs to identify potentially good individuals that would not otherwise survive from previous generations. In addition, a population initialization strategy is devised to produce diverse neural architectures, ranging from ResNet-like to Inception-like ones. Experiments show that MFENAS takes only 0.6 GPU days to find its best architecture, which achieves a 2.39% test error rate on CIFAR-10, outperforming most state-of-the-art neural architecture search approaches. Architectures transferred to CIFAR-100 and ImageNet also exhibit competitive performance.
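
The core idea of the multifidelity evaluation lends itself to a short illustration. The Python sketch below shows one plausible way to integrate validation scores recorded at several small epoch budgets into a single fitness value, so that individuals whose score at one budget underestimates their quality can still survive selection. All names (`Individual`, `train_and_score`, `FIDELITIES`, `WEIGHTS`) and the weighted-sum aggregation are hypothetical illustrations, not the authors' actual formulation.

```python
# A minimal sketch of the multifidelity-evaluation idea from the abstract:
# each architecture is trained for only a few epochs, and the scores
# obtained under several epoch budgets are integrated so that individuals
# that look weak at one budget but promising at another can still survive.
# All names and the weighted-sum aggregation are illustrative assumptions,
# not the authors' actual code.
import random
from dataclasses import dataclass, field

FIDELITIES = (4, 8, 12)    # assumed low-epoch training budgets
WEIGHTS = (0.2, 0.3, 0.5)  # assumed weights favouring the higher budget

@dataclass
class Individual:
    architecture: object                        # encoded neural architecture
    scores: dict = field(default_factory=dict)  # epochs -> validation accuracy

def train_and_score(architecture, epochs):
    """Stand-in for partially training `architecture` and measuring its
    validation accuracy after `epochs` epochs; replace with real training."""
    return random.random()  # placeholder score

def multifidelity_fitness(ind):
    # Evaluate once per epoch budget, caching the results on the individual.
    for epochs in FIDELITIES:
        if epochs not in ind.scores:
            ind.scores[epochs] = train_and_score(ind.architecture, epochs)
    # Integrate the evaluations across fidelities into one fitness value.
    return sum(w * ind.scores[e] for w, e in zip(WEIGHTS, FIDELITIES))

def select_survivors(population, k):
    # Environmental selection by the integrated multifidelity score.
    return sorted(population, key=multifidelity_fitness, reverse=True)[:k]

# Example: keep the 10 most promising of 40 candidate architectures.
population = [Individual(architecture=i) for i in range(40)]
survivors = select_survivors(population, k=10)
```

The weighted sum is only one possible aggregation; any scheme that combines scores across epoch budgets (e.g., rank-based integration) would express the same idea of rescuing individuals that a single low-fidelity evaluation would discard.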