Exploring Complicated Search Spaces With Interleaving-Free Sampling

Published: 2025. Last Modified: 07 Jan 2026. IEEE Trans. Neural Networks Learn. Syst., 2025. License: CC BY-SA 4.0.
Abstract: Conventional neural architecture search (NAS) algorithms typically operate on search spaces with short-distance node connections. We argue that such designs, though safe and stable, are obstacles to exploring more effective network architectures. In this brief, we study search algorithms on a complicated search space with long-distance connections and show that existing weight-sharing search algorithms fail due to the existence of interleaved connections (ICs). Based on this observation, we present a simple yet effective algorithm, termed interleaving-free neural architecture search (IF-NAS). We further design a periodic sampling strategy to construct subnetworks during the search procedure, preventing ICs from emerging in any of them. In the proposed search space, IF-NAS outperforms both random sampling and previous weight-sharing search algorithms by significant margins. It also generalizes well to micro cell-based spaces. This study emphasizes the importance of macro structure, and we look forward to further efforts in this direction. The code is available at github.com/sunsmarterjie/IFNAS.
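
To illustrate the core idea, below is a minimal sketch of what a periodic, interleaving-free sampler could look like. It treats candidate connections as (source, destination) node pairs in a DAG and defines two edges as interleaved when they cross without nesting. The crossing-edge definition, the greedy grouping heuristic, and all names (`edges_interleave`, `build_groups`, `periodic_sample`) are illustrative assumptions, not the paper's exact procedure; consult the released code for the authors' implementation.

```python
# Hypothetical sketch of periodic, interleaving-free subnetwork sampling.
# Assumption: edges (a, b) and (c, d) "interleave" when a < c < b < d,
# i.e. they cross without being nested or disjoint.

def edges_interleave(e1, e2):
    """Return True if the two edges cross (interleaved connection)."""
    (a, b), (c, d) = sorted([e1, e2])
    return a < c < b < d

def build_groups(edges):
    """Greedily partition edges into mutually interleaving-free groups."""
    groups = []
    for e in edges:
        for g in groups:
            if all(not edges_interleave(e, f) for f in g):
                g.append(e)
                break
        else:
            groups.append([e])
    return groups

def periodic_sample(edges, num_steps):
    """Cycle through the groups so every edge gets trained, while no
    sampled subnetwork ever contains an interleaved pair of edges."""
    groups = build_groups(edges)
    for step in range(num_steps):
        yield groups[step % len(groups)]

# Example: node indices 0..4 with short- and long-distance connections.
edges = [(0, 1), (1, 2), (0, 2), (1, 3), (0, 4), (2, 4)]
for step, subnet in enumerate(periodic_sample(edges, 4)):
    print(step, subnet)
```

In this sketch, (0, 2) and (1, 3) land in different groups because they cross, so no sampled subnetwork contains both; cycling through the groups over training steps keeps every candidate connection covered.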