MRT-NAS: Boosting Training-Free NAS via Manifold Regularization

Published: 2025 · Last Modified: 16 Jan 2026 · ICANN (1) 2025 · CC BY-SA 4.0
Abstract: Training-free Neural Architecture Search (NAS) aims to automatically discover high-performing neural networks using zero-cost proxies, which predict a network's performance directly and thus avoid the resource-intensive training process. In this paper, we observe that existing zero-cost proxies favor high-complexity networks (i.e., those with large Param and FLOPs counts) that are challenging to optimize, which degrades the performance of training-free NAS. Although skip connections can alleviate these optimization difficulties and reduce complexity, they are heavily penalized in proxy scoring. To address this performance collapse, we propose Manifold Regularization for Training-free NAS (MRT-NAS), which improves a proxy's ability to identify skip-connection structures by measuring the similarity between the network's input and output manifolds. Notably, MRT-NAS can regularize any zero-cost proxy in a plug-and-play manner. Experimental results across 3 search spaces and 5 real-world tasks validate the effectiveness of MRT-NAS in boosting the performance of all given zero-cost proxies with negligible time cost. Our implementation is available at https://github.com/yoshimatsuu/MRT-NAS
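The abstract does not specify how the input–output manifold similarity is computed or combined with the proxy score; the sketch below is a plausible stand-in, not the paper's method. It uses linear CKA (centered kernel alignment) as a hypothetical similarity measure between a batch of network inputs and outputs, and a hypothetical `regularized_score` helper that adds it to a raw zero-cost proxy score with weight `lam`:

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA between two feature matrices of shape (batch, dim).

    Returns a value in [0, 1]; 1 means the two representations span
    identical (linearly aligned) subspaces.
    """
    # Center each feature dimension across the batch
    X = X - X.mean(axis=0, keepdims=True)
    Y = Y - Y.mean(axis=0, keepdims=True)
    # CKA(X, Y) = ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)
    num = np.linalg.norm(Y.T @ X, ord="fro") ** 2
    den = np.linalg.norm(X.T @ X, ord="fro") * np.linalg.norm(Y.T @ Y, ord="fro")
    return num / (den + 1e-12)

def regularized_score(proxy_score, inputs, outputs, lam=1.0):
    """Hypothetical plug-and-play combination: raw proxy score plus a
    manifold-similarity bonus, which rewards architectures (e.g. those
    rich in skip connections) whose outputs stay close to the input
    manifold."""
    return proxy_score + lam * linear_cka(inputs, outputs)
```

A network dominated by skip connections keeps its output close to its input, so `linear_cka` is high and the bonus offsets the low raw proxy score such networks would otherwise receive; the actual similarity measure and weighting in MRT-NAS should be taken from the paper and repository.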