Efficient One-Shot Neural Architecture Search With Progressive Choice Freezing Evolutionary Search

Published: 01 Feb 2023, Last Modified: 13 Feb 2023. Submitted to ICLR 2023.
Abstract: Neural Architecture Search (NAS) is a fast-developing research field that aims to automate machine learning. Among recently proposed NAS methods, One-Shot NAS has attracted significant attention since it greatly reduces training cost compared with earlier NAS methods. In One-Shot NAS, the best network architecture is searched for within a supernet, which is trained only once. In practice, however, the search process involves numerous inference passes for each use case, which incurs high overhead in terms of latency and energy consumption. To tackle this problem, we first observe that the choices for the first few blocks of different candidate networks become similar early in the search. Furthermore, these choices are already close to the optimal choices obtained at the end of the search. Leveraging this observation, we propose Progressive Choice Freezing Evolutionary Search (PCF-ES), a method that gradually freezes block choices for all subnets over successive search generations. Freezing gives us the opportunity to reuse the intermediate data produced by frozen blocks instead of re-computing it. Experimental results show that PCF-ES provides up to 55% speedup and reduces energy consumption by 51% during the search stage.
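To make the mechanism concrete, below is a minimal Python sketch of the search loop described in the abstract. It is a sketch under assumptions, not the authors' implementation: the names and constants (evaluate_subnet, NUM_BLOCKS, POP_SIZE) are hypothetical, the one-block-per-generation freezing schedule is an illustrative choice, and the activation caching that produces the reported savings is only indicated in a comment.

```python
import random
from collections import Counter

# Illustrative sketch of Progressive Choice Freezing Evolutionary Search
# (PCF-ES). Every name and constant here (NUM_BLOCKS, evaluate_subnet,
# the one-block-per-generation freezing schedule, etc.) is an assumption
# for exposition, not the authors' code.

NUM_BLOCKS = 20     # supernet depth (assumed)
NUM_CHOICES = 4     # candidate operations per block (assumed)
POP_SIZE = 50
GENERATIONS = 20

def evaluate_subnet(arch):
    """Placeholder for validation inference of one candidate architecture.

    In PCF-ES, once the leading blocks are frozen, their intermediate
    activations can be cached and reused across candidates instead of
    re-computed, which is where the reported latency and energy savings
    would come from.
    """
    return random.random()  # stand-in for validation accuracy

def mutate(arch, start):
    """Mutate one block choice outside the frozen prefix."""
    child = list(arch)
    if start < NUM_BLOCKS:  # nothing left to mutate once all blocks are frozen
        b = random.randrange(start, NUM_BLOCKS)
        child[b] = random.randrange(NUM_CHOICES)
    return child

def search():
    frozen = []  # block choices fixed so far, shared by all candidates
    population = [[random.randrange(NUM_CHOICES) for _ in range(NUM_BLOCKS)]
                  for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        # Force every candidate to agree with the frozen prefix.
        for arch in population:
            arch[:len(frozen)] = frozen
        ranked = sorted(population, key=evaluate_subnet, reverse=True)
        top = ranked[:POP_SIZE // 2]
        # Progressively freeze the next block to the majority choice among
        # the current top candidates (assumed schedule: one per generation).
        if len(frozen) < NUM_BLOCKS:
            block = len(frozen)
            frozen.append(Counter(a[block] for a in top).most_common(1)[0][0])
        # Refill the population by mutating survivors outside the frozen prefix.
        population = top + [mutate(random.choice(top), len(frozen))
                            for _ in range(POP_SIZE - len(top))]
    return max(population, key=evaluate_subnet)

if __name__ == "__main__":
    print("best architecture:", search())
```

The design point the sketch tries to surface: once a block choice is frozen, every candidate shares an identical network prefix, so that prefix's forward pass can in principle be computed once per validation batch and its output reused for all candidates, rather than re-run per subnet.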
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: General Machine Learning (i.e., none of the above)