BPNAS: Bayesian Progressive Neural Architecture Search

Published: 27 Jun 2024, Last Modified: 20 Aug 2024
Venue: Differentiable Almost Everything
License: CC BY 4.0
Keywords: Network architecture search, differentiable NAS, network ensemble search
TL;DR: We propose a Bayesian neural architecture search (NAS) method under the differentiable NAS framework, which enables the straightforward generation of architecture samples.
Abstract: Across the performance landscapes of multiple NAS benchmarks, only a few operations contribute to higher performance while the rest are detrimental. This motivates tailoring the posterior distribution by placing higher prior mass on sparser supernetworks, so that unimportant operations are progressively pruned. Moreover, the Bayesian scheme enables the straightforward generation of architecture samples given an architecture estimate from any NAS method. To that end, we propose **BPNAS**, a Bayesian progressive neural architecture search (NAS) method that combines recent advances in differentiable NAS with Bayesian inference, adopting a sparse prior on the network architecture for faster convergence and for uncertainty quantification in the architecture search. With numerical experiments on popular NAS search spaces, we show that **BPNAS** improves accuracy and convergence speed over state-of-the-art NAS approaches on benchmark datasets.
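The abstract's core idea can be illustrated with a minimal sketch. The paper's exact prior and posterior are not specified here, so this example assumes a sparse Dirichlet distribution (concentration below 1) over the mixing weights of candidate operations on each edge of a DARTS-style cell; such a prior concentrates mass on a few operations, mirroring the progressive-pruning motivation, and drawing from it yields architecture samples. The operation list and function names are hypothetical.

```python
import numpy as np

# Hypothetical candidate operations for each edge of a DARTS-style cell.
OPS = ["skip_connect", "sep_conv_3x3", "sep_conv_5x5", "max_pool_3x3", "none"]

def sample_architectures(n_edges, n_samples, concentration=0.3, seed=0):
    """Illustrative only: draw architecture samples by sampling per-edge
    operation weights from a sparse Dirichlet (concentration < 1) and
    keeping the highest-weight operation on each edge."""
    rng = np.random.default_rng(seed)
    archs = []
    for _ in range(n_samples):
        # Sparse Dirichlet puts most mass on a few operations per edge.
        weights = rng.dirichlet(np.full(len(OPS), concentration), size=n_edges)
        # Discretize: one operation per edge (argmax over weights).
        archs.append([OPS[i] for i in weights.argmax(axis=1)])
    return archs

samples = sample_architectures(n_edges=4, n_samples=3)
for arch in samples:
    print(arch)
```

In the actual method the Dirichlet parameters would be learned (a posterior fitted during the differentiable search) rather than fixed; the sketch only shows why a sparse distribution over operation weights makes architecture sampling straightforward.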
Submission Number: 4