Training-Free Multi-Objective and Many-Objective Evolutionary Neural Architecture Search with Synaptic Flow
Abstract: Neural architecture search (NAS) algorithms often suffer from expensive computational costs because a sufficient number of candidate architectures must be evaluated during the search. Each architecture evaluation involves hundreds of training epochs to obtain proper weights for computing the accuracy of that architecture. Recently, a training-free performance metric, Synaptic Flow, has been proposed to facilitate these architecture evaluations. Synaptic Flow can be computed using randomly initialized network weights, and its values correlate to a certain degree with network test accuracy. Furthermore, in real-world neural architecture design, network performance (e.g., test accuracy) is not the sole objective; network complexity metrics (e.g., the number of parameters, latency) are also considered. In this paper, we investigate several multi-objective NAS problem formulations, each involving one performance metric and one complexity metric, and a many-objective NAS problem formulation involving one performance metric and four complexity metrics. We consider two variants of the performance metric for each formulation: a training-based variant that employs network accuracy and a training-free variant that employs Synaptic Flow. We use the non-dominated sorting genetic algorithm II (NSGA-II) to solve these NAS problem formulations, and then compare the quality of the obtained architectures and the efficiency of solving each formulation. Experimental results on the standard benchmark NATS-Bench exhibit the advantages of the training-free many-objective evolutionary NAS (TF-MaOENAS) approach in obtaining competitive architectures at a reasonable computing cost. The code is available at: https://github.com/ELO-Lab/TF-MaOENAS.
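To illustrate the training-free metric the abstract refers to, the following is a minimal NumPy sketch of the Synaptic Flow (SynFlow) saliency for a purely linear MLP. It is not the authors' implementation; the function name `synflow_score` and the restriction to a list of weight matrices are assumptions made for this example. SynFlow defines the data-free objective R = 1^T |W_L| ... |W_1| 1 (an all-ones input pushed through the absolute-valued weights) and scores each parameter by |theta * dR/dtheta|; the architecture score is the sum of these saliencies, computed here with an explicit forward and backward pass and no training.

```python
import numpy as np

def synflow_score(weights):
    """Sketch of the SynFlow architecture score for a linear MLP.

    `weights` is a hypothetical representation: a list [W1, ..., WL] of
    weight matrices, where layer l has shape (out_l, in_l).
    R = 1^T |W_L| ... |W_1| 1; the per-parameter saliency is
    |theta * dR/dtheta|, and the score is their sum over all layers.
    """
    A = [np.abs(W) for W in weights]           # linearize: take |weights|
    # Forward pass on an all-ones input: a_l = |W_l| ... |W_1| 1
    fwd = [np.ones(A[0].shape[1])]
    for W in A:
        fwd.append(W @ fwd[-1])
    # Backward pass from an all-ones output gradient:
    # b_l = |W_{l+1}|^T ... |W_L|^T 1
    bwd = [np.ones(A[-1].shape[0])]
    for W in reversed(A[1:]):
        bwd.insert(0, W.T @ bwd[0])
    # dR/dW_l is the outer product b_l a_{l-1}^T; sum |W_l * dR/dW_l|.
    return sum(float(np.sum(np.abs(W) * np.outer(b, a)))
               for W, a, b in zip(A, fwd[:-1], bwd))
```

A useful sanity check on this linear case: each layer's saliency sum telescopes to R itself, so the total score equals L * R for an L-layer linear network. In the paper's setting, a score like this would serve as the training-free surrogate for the accuracy objective inside NSGA-II.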