Abstract: Neural Architecture Search (NAS) has been extensively studied for its ability to automate architecture engineering. Existing NAS methods rely heavily on gradients and data labels, and thus either incur immense computational costs or suffer from the discretization discrepancy induced by the supernet structure. Moreover, most of them are limited in their ability to generate diverse architectures. To alleviate these issues, in this paper, we propose a novel zero-cost proxy called $\mathsf {MeCo}$ based on the Pearson correlation matrix of the feature maps. Unlike previous work, computing $\mathsf {MeCo}$ and its variant $\mathsf {MeCo_{opt}}$ requires only one random input and a single forward pass. Building on the proposed zero-cost proxy, we further craft a new zero-shot NAS scheme called $\mathsf {FLASH}$, which harnesses a new proxy-based operation scoring function and a greedy heuristic. Compared to existing methods, $\mathsf {FLASH}$ is highly efficient and constructs diverse model architectures instead of repeated cells. We design comprehensive experiments and extensively evaluate our designs on multiple benchmarks and datasets. The experimental results show that our method is one to six orders of magnitude more efficient than the state-of-the-art baselines while achieving the highest model accuracy.
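To make the proxy concrete, below is a minimal PyTorch sketch of a $\mathsf{MeCo}$-style score. It assumes the score sums, over the convolutional layers, the minimum eigenvalue of the Pearson correlation matrix computed across the channels of each layer's feature map, using one random input and a single forward pass as the abstract describes. The function names, the choice of hooking `nn.Conv2d` modules, and the input shape are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn

def pearson_corr(feat: torch.Tensor) -> torch.Tensor:
    """Pearson correlation matrix across the channels of one feature map.

    feat: (C, H, W) activation of a single layer for a single input.
    Returns a symmetric (C, C) correlation matrix.
    """
    x = feat.flatten(start_dim=1)                  # (C, H*W)
    x = x - x.mean(dim=1, keepdim=True)            # center each channel
    x = x / (x.norm(dim=1, keepdim=True) + 1e-8)   # unit-normalize channels
    return x @ x.t()                               # (C, C) Pearson correlations

@torch.no_grad()
def meco_score(model: nn.Module, input_shape=(3, 32, 32)) -> float:
    """Score a candidate network with one random datum and one forward pass."""
    feats = []
    hooks = [m.register_forward_hook(lambda mod, inp, out: feats.append(out))
             for m in model.modules() if isinstance(m, nn.Conv2d)]
    model.eval()
    model(torch.randn(1, *input_shape))            # single random input, no labels
    for h in hooks:
        h.remove()
    score = 0.0
    for f in feats:
        corr = pearson_corr(f[0])                  # drop the batch dimension
        score += torch.linalg.eigvalsh(corr).min().item()  # smallest eigenvalue
    return score
```

Because no gradients or labels are involved, scoring a candidate architecture this way costs a single inference, which is consistent with the efficiency claim in the abstract.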