Abstract: Recently, zero-cost proxies for neural architecture search (NAS) have attracted increasing attention. They allow us to discover top-performing neural networks through architecture scoring without training a very large network (i.e., a supernet), thus saving significant computational resources during the search. However, to our knowledge, no single proxy works best across different tasks and scenarios. To consolidate the strengths of different proxies and reduce search bias, we propose a unified-proxy neural architecture search framework (UP-NAS), which learns a multi-proxy estimator that predicts a unified score by combining multiple zero-cost proxies. The predicted score is then used for an efficient gradient-ascent architecture search in the embedding space of neural architectures. Our approach not only saves the computational time required to evaluate multiple proxies during the search but also gains the flexibility to consolidate existing proxies across different tasks. We conduct experiments on the NAS-Bench-201 and DARTS search spaces with different datasets, and the results demonstrate the effectiveness of the proposed approach. Code is available at https://github.com/AI-Application-and-Integration-Lab/UP-NAS.
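To make the two components described above concrete, here is a minimal PyTorch sketch of (1) a multi-proxy estimator that maps an architecture embedding to a single unified score, and (2) a gradient-ascent search over that embedding space. The estimator architecture, its training target (a regression onto combined zero-cost proxy scores), and the decoding step are illustrative assumptions, not the authors' implementation; see the repository for the actual method.

```python
import torch
import torch.nn as nn

class MultiProxyEstimator(nn.Module):
    """Predicts a unified score from an architecture embedding.

    Assumed to be trained offline to regress a combination of several
    zero-cost proxy scores (e.g., snip, grasp, synflow), so a single
    forward pass replaces evaluating every proxy at search time.
    """
    def __init__(self, embed_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(embed_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z).squeeze(-1)

def gradient_ascent_search(estimator: MultiProxyEstimator,
                           z_init: torch.Tensor,
                           steps: int = 100,
                           lr: float = 0.1) -> torch.Tensor:
    """Searches the embedding space by maximizing the predicted score."""
    z = z_init.clone().requires_grad_(True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = -estimator(z).mean()  # negate: ascend the unified score
        loss.backward()
        opt.step()
    # The optimized embedding would then be decoded back to the
    # nearest discrete architecture in the search space.
    return z.detach()
```

A usage sketch: embed candidate architectures (e.g., via a learned encoder over NAS-Bench-201 cell encodings), fit the estimator on proxy scores once, then run `gradient_ascent_search` from several random initializations and decode the best-scoring embeddings.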