Speeding up NAS with Adaptive Subset Selection

Published: 16 May 2022, Last Modified: 05 May 2023
Venue: AutoML 2022 (Late-Breaking Workshop)
Abstract: The majority of recent developments in neural architecture search (NAS) have aimed to decrease the computational cost of NAS techniques without affecting their final performance. To this end, many low-fidelity and performance-prediction methods have been considered, including training on subsets of the data. In this work, we initiate the study of *adaptive* subset selection for NAS and present it as complementary to state-of-the-art NAS approaches. We uncover a natural connection between one-shot NAS algorithms and adaptive subset selection and devise an algorithm that makes use of state-of-the-art techniques from both areas. We use these techniques to substantially reduce the runtime of DARTS-PT, a leading one-shot NAS algorithm, without sacrificing accuracy. Our results are consistent across multiple datasets, and our code and all materials needed to reproduce our results are available at https://anonymous.4open.science/r/SubsetSelection_NAS-2BE4.
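
As a purely illustrative reading of the interleaving described in the abstract, the sketch below alternates one-shot supernet training with periodic, adaptive re-selection of the training subset. Everything in it is an assumption for illustration, not the paper's implementation: `TinySupernet`, `select_subset`, the loss-based selection score (a simple stand-in for the gradient-based criteria, such as GLISTER or GRAD-MATCH, common in the adaptive subset selection literature), and all hyperparameters are hypothetical.

```python
# Hypothetical sketch: one-shot NAS training interleaved with adaptive
# subset re-selection. Names and hyperparameters are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, Subset, TensorDataset

class TinySupernet(nn.Module):
    """Stand-in for a DARTS-style supernet: two candidate ops mixed
    by a softmax over architecture parameters alpha."""
    def __init__(self, dim=16, num_classes=10):
        super().__init__()
        self.ops = nn.ModuleList([nn.Linear(dim, dim), nn.Identity()])
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))  # arch params
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x):
        w = F.softmax(self.alpha, dim=0)
        mixed = sum(wi * op(x) for wi, op in zip(w, self.ops))
        return self.head(F.relu(mixed))

def select_subset(model, dataset, budget):
    """Score every example by its current loss and keep the top `budget`.
    A deliberately simple proxy for the gradient-based selection criteria
    used in the adaptive subset selection literature."""
    scores = []
    model.eval()
    with torch.no_grad():
        for x, y in DataLoader(dataset, batch_size=256):
            scores.append(F.cross_entropy(model(x), y, reduction="none"))
    idx = torch.cat(scores).topk(budget).indices.tolist()
    return Subset(dataset, idx)

# Toy data standing in for an image-classification training set.
data = TensorDataset(torch.randn(2048, 16), torch.randint(0, 10, (2048,)))
model = TinySupernet()
w_opt = torch.optim.SGD(
    [p for n, p in model.named_parameters() if n != "alpha"], lr=0.05)
a_opt = torch.optim.Adam([model.alpha], lr=3e-3)

budget, refresh_every = 256, 2  # train on 12.5% of the data, re-pick every 2 epochs
subset = select_subset(model, data, budget)
for epoch in range(6):
    if epoch > 0 and epoch % refresh_every == 0:
        subset = select_subset(model, data, budget)  # adaptive re-selection
    model.train()
    for x, y in DataLoader(subset, batch_size=64, shuffle=True):
        loss = F.cross_entropy(model(x), y)
        w_opt.zero_grad(); a_opt.zero_grad()
        loss.backward()
        # Joint weight + alpha update: a single-level simplification of the
        # bilevel optimization used by DARTS-style methods.
        w_opt.step(); a_opt.step()
    print(f"epoch {epoch}: loss={loss.item():.3f}")
```

The speedup in such a scheme comes from running every supernet update on only the current subset, while the periodic re-selection keeps the subset informative as the model changes.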
Keywords: Deep Learning, Neural Architecture Search, Subset Selection
One-sentence Summary: We use techniques from adaptive subset selection to substantially reduce the runtime of one-shot NAS without sacrificing accuracy.
Reproducibility Checklist: Yes
Broader Impact Statement: Yes
Paper Availability And License: Yes
Code Of Conduct: Yes
Reviewers: Colin White, crwhite@cs.cmu.edu
Main Paper And Supplementary Material: pdf
Code And Dataset Supplement: https://anonymous.4open.science/r/SubsetSelection_NAS-2BE4/