Abstract: Neural architecture search (NAS) based on zero-shot proxy evaluation assesses neural networks using mathematical metrics without any network training, yielding the fastest evaluation speed in the NAS field and greatly reducing the cost of neural network development in resource-constrained environments. However, the evolutionary search strategies commonly used in such algorithms often suffer from slow convergence and require numerous iterations, limiting their performance advantages. In this paper, we identify population diversity and the number of iterations spent on dominant individuals as key efficiency factors in evolutionary NAS. Based on this observation, we propose RetNAS, a novel lightweight neural architecture search method based on an evolutionary algorithm. RetNAS uses a low-cost procedure to explicitly model the probability distribution of the operators and hyperparameters of dominant individuals, and uses this distribution to guide the random architecture search process. Through a gradual mixing and iteration process, it accelerates convergence and improves search results. Experimental results show that, compared to baseline algorithms, RetNAS matches their evaluation metrics with only 50\% of the search steps, substantially expediting the search. Furthermore, RetNAS can be easily integrated into other NAS algorithms, yielding similar performance improvements.
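To make the guided-sampling idea in the abstract concrete, the following Python sketch shows one plausible form of such a distribution-guided evolutionary search. It is a minimal illustration under stated assumptions, not the authors' implementation: the toy search space (`OPERATORS`, `NUM_LAYERS`), the `zero_shot_proxy` placeholder, and the mixing weight `MIX` are all hypothetical names introduced here for illustration.

```python
import random
from collections import Counter

# Assumed toy search space; the paper's actual operator set is not given here.
OPERATORS = ["conv3x3", "conv5x5", "dwconv3x3", "maxpool", "skip"]
NUM_LAYERS = 8   # assumed architecture depth
TOP_K = 16       # number of dominant individuals used to fit the distribution
MIX = 0.5        # weight mixing the learned distribution with uniform sampling

def zero_shot_proxy(arch):
    """Placeholder for a training-free proxy score (e.g., a gradient- or
    expressivity-based metric); here just a random stand-in."""
    return random.random()

def fit_distribution(dominant):
    """Explicitly model, per layer, the empirical operator frequencies
    observed among the dominant individuals."""
    dists = []
    for layer in range(NUM_LAYERS):
        counts = Counter(arch[layer] for arch in dominant)
        total = sum(counts.values())
        dists.append({op: counts[op] / total for op in OPERATORS})
    return dists

def sample_architecture(dists):
    """Sample a new candidate from a mixture of the learned per-layer
    distribution and a uniform distribution."""
    uniform = 1.0 / len(OPERATORS)
    arch = []
    for layer in range(NUM_LAYERS):
        weights = [MIX * dists[layer][op] + (1.0 - MIX) * uniform
                   for op in OPERATORS]
        arch.append(random.choices(OPERATORS, weights=weights, k=1)[0])
    return tuple(arch)

def guided_search(pop_size=64, steps=20):
    """Iterate: score with the zero-shot proxy, keep the top-k dominant
    individuals, refit the distribution, and resample the population."""
    population = [tuple(random.choice(OPERATORS) for _ in range(NUM_LAYERS))
                  for _ in range(pop_size)]
    for _ in range(steps):
        scored = sorted(population, key=zero_shot_proxy, reverse=True)
        dominant = scored[:TOP_K]
        dists = fit_distribution(dominant)
        # Keep the dominant individuals; resample the rest under guidance.
        population = dominant + [sample_architecture(dists)
                                 for _ in range(pop_size - TOP_K)]
    return max(population, key=zero_shot_proxy)
```

Note the design choice mirrored in `sample_architecture`: the learned distribution only biases sampling rather than replacing it, and the uniform component preserves population diversity, one of the two efficiency factors the abstract highlights.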