Enhancing Training-Free Multi-Objective Pruning-Based Neural Architecture Search with Low-Cost Local Search

Published: 01 Jan 2023 · Last Modified: 02 Aug 2025 · RIVF 2023 · CC BY-SA 4.0
Abstract: Multi-Objective Neural Architecture Search (MONAS) aims to discover network architectures that optimize not only predictive performance but also other objectives such as the number of network parameters or inference latency. MONAS problems are typically solved with multi-objective evolutionary algorithms (MOEAs). The recently introduced Training-Free Multi-Objective Pruning-Based Neural Architecture Search (TF-MOPNAS) has demonstrated performance comparable to that of MOEAs. The effectiveness of TF-MOPNAS, however, has not been thoroughly evaluated. This study first evaluates TF-MOPNAS by assessing the quality of its approximation fronts with the Hypervolume indicator, a standard measure for comparing multi-objective algorithms in practice. We then introduce a low-cost local search method, Training-Free Local Search (TF-LS), to improve the efficacy of TF-MOPNAS. TF-LS employs the training-free Synaptic Flow metric so that the quality of the approximation fronts is improved at a negligible cost. Experimental results on two widely used NAS search spaces, NAS-Bench-101 and NAS-Bench-201, demonstrate that integrating TF-LS into TF-MOPNAS improves the Hypervolume values, and that the combination outperforms other state-of-the-art NAS methods under a limited computation budget.
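The abstract leans on two technical ingredients that are easy to sketch: the Hypervolume indicator used to score approximation fronts, and the Synaptic Flow (SynFlow) metric that lets TF-LS rank architectures without any training. The Python sketch below illustrates both under stated assumptions; it is not the paper's implementation, and the function names (`hypervolume_2d`, `synflow_score`) and the default input shape are illustrative.

```python
import torch
import torch.nn as nn


def hypervolume_2d(front, ref):
    """Hypervolume of a 2-D approximation front under minimization:
    the area dominated by the front and bounded by the reference
    point `ref` (which every point in the front must dominate)."""
    # Keep only non-dominated points, sorted by the first objective.
    nd = []
    for f1, f2 in sorted(front):
        if not nd or f2 < nd[-1][1]:
            nd.append((f1, f2))
    # Sum the axis-aligned rectangles between consecutive points.
    hv = 0.0
    for i, (f1, f2) in enumerate(nd):
        next_f1 = nd[i + 1][0] if i + 1 < len(nd) else ref[0]
        hv += (next_f1 - f1) * (ref[1] - f2)
    return hv


def synflow_score(model: nn.Module, input_shape=(3, 32, 32)):
    """Training-free Synaptic Flow score (Tanaka et al., 2020):
    sum_i |theta_i * dR/dtheta_i|, where R is the output of the
    "linearized" network (all weights made positive) on an
    all-ones input, so no data and no labels are needed."""
    # Linearize: take the absolute value of every parameter,
    # remembering the signs so the weights can be restored.
    signs = {}
    with torch.no_grad():
        for name, p in model.named_parameters():
            signs[name] = torch.sign(p)
            p.abs_()

    model.zero_grad()
    model.double()  # double precision avoids overflow in deep nets
    out = model(torch.ones(1, *input_shape, dtype=torch.float64))
    out.sum().backward()

    score = sum(
        (p * p.grad).abs().sum().item()
        for p in model.parameters()
        if p.grad is not None
    )

    # Restore the original signed weights (model stays in float64).
    with torch.no_grad():
        for name, p in model.named_parameters():
            p.mul_(signs[name])

    return score
```

As a usage example, `hypervolume_2d([(0.10, 4.0), (0.25, 1.5)], ref=(1.0, 10.0))` returns the dominated area 7.275; in a local search like TF-LS, a zero-cost proxy such as `synflow_score` would be the quantity used to rank neighbouring architectures in place of trained accuracy.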