Bridge the Gap Between Architecture Spaces via A Cross-Domain Predictor

Published: 31 Oct 2022, Last Modified: 14 Dec 2022 · NeurIPS 2022 Accept
Keywords: neural architecture search, neural predictor, domain adaptation
TL;DR: We apply domain adaptation techniques to neural predictors, yielding a cross-domain predictor that outperforms state-of-the-art methods on benchmark datasets.
Abstract: Neural Architecture Search (NAS) can automatically design promising neural architectures without human expertise. Despite its great success, a prohibitively high search cost is required to find a high-performance architecture, which hinders its practical application. A neural predictor can directly evaluate the performance of a neural network based on its architecture and thereby save much of this budget. However, existing neural predictors require substantial annotated architectures trained from scratch, which still consumes considerable computational resources. To solve this issue, we propose a Cross-Domain Predictor (CDP), which is trained on existing NAS benchmark datasets (e.g., NAS-Bench-101) but can be used to find high-performance architectures in large-scale search spaces. In particular, we propose a progressive subspace adaptation strategy to address the domain discrepancy between the source architecture space and the target space. Given the large difference between the two architecture spaces, an assistant space is developed to smooth the transfer process. Compared with existing NAS methods, the proposed CDP is far more efficient: for example, CDP requires a search cost of only 0.1 GPU days to find architectures achieving 76.9% top-1 accuracy on ImageNet and 97.51% on CIFAR-10.
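To make the idea concrete, here is a minimal PyTorch sketch of a cross-domain neural predictor trained with a domain-adversarial objective. It is not the paper's method: the progressive subspace adaptation strategy and the assistant space are not reproduced, and a standard gradient-reversal objective (Ganin & Lempitsky, 2015) is substituted as a stand-in for aligning the source space (labeled NAS-Bench-101 architectures) with the target space (unlabeled architectures). All names here (`CrossDomainPredictor`, `train_step`, the 64-dimensional architecture encoding) are hypothetical.

```python
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    """Gradient reversal layer: identity forward, negated gradient backward."""

    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None


class CrossDomainPredictor(nn.Module):
    """Shared encoder + accuracy regressor + adversarial domain classifier."""

    def __init__(self, enc_dim=64, hidden=128):
        super().__init__()
        # Encoder over fixed-length architecture encodings (hypothetical format).
        self.encoder = nn.Sequential(
            nn.Linear(enc_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        # Regression head: predicts accuracy from labeled source architectures.
        self.reg_head = nn.Linear(hidden, 1)
        # Domain head: classifies source vs. target, fed reversed gradients
        # so the encoder learns domain-invariant features.
        self.dom_head = nn.Sequential(
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Linear(hidden, 2)
        )

    def forward(self, x, lam=1.0):
        h = self.encoder(x)
        acc = self.reg_head(h).squeeze(-1)
        dom = self.dom_head(GradReverse.apply(h, lam))
        return acc, dom


def train_step(model, opt, src_x, src_y, tgt_x, lam=0.1):
    """One update: regression loss on source + adversarial domain loss."""
    opt.zero_grad()
    acc_s, dom_s = model(src_x, lam)
    _, dom_t = model(tgt_x, lam)
    reg_loss = nn.functional.mse_loss(acc_s, src_y)
    dom_logits = torch.cat([dom_s, dom_t])
    dom_labels = torch.cat([
        torch.zeros(len(src_x), dtype=torch.long),  # source domain = 0
        torch.ones(len(tgt_x), dtype=torch.long),   # target domain = 1
    ])
    dom_loss = nn.functional.cross_entropy(dom_logits, dom_labels)
    (reg_loss + dom_loss).backward()
    opt.step()
    return reg_loss.item(), dom_loss.item()


# Usage with random stand-in data for the two architecture spaces:
model = CrossDomainPredictor()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
src_x, src_y = torch.randn(32, 64), torch.rand(32)  # labeled source encodings
tgt_x = torch.randn(32, 64)                         # unlabeled target encodings
train_step(model, opt, src_x, src_y, tgt_x)
```

Once trained, such a predictor ranks candidate target-space architectures by predicted accuracy, so only the top-ranked candidates need to be trained from scratch; the paper's contribution lies in how the domain gap is bridged, not in this generic adversarial setup.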