Single Train Multi Deploy on Topology Search Spaces using Kshot-Hypernet

Published: 18 Jun 2024, Last Modified: 08 Jul 2024
Venue: WANT@ICML 2024 Poster
License: CC BY 4.0
Keywords: Efficient Neural Architecture Search, AutoML
Abstract: Neural Architecture Search (NAS) has long been an important research direction, aiming to replace labor-intensive manual architecture design. Since the introduction of weight sharing in NAS, the resource and time cost of architecture search has been reduced significantly. In addition, variants of NAS have been proposed that eliminate the need for retraining by inferring model parameters directly from the shared weights after the search. However, these methods are mainly built on the MobileNet search space, which primarily supports size search (e.g., kernel sizes and channel widths) rather than topology search. For the important topology search space, no retraining-free NAS method has been proposed. In this work, we fill this gap by proposing a NAS method for the topology search space that does not require retraining. Our method combines the advantages of the previously proposed Hypernetwork and Kshot-NAS. We also propose a new distillation and sampling method for this new NAS architecture. We present results on NAS-Bench-201 and show that our method matches or even exceeds the baseline performance of post-search retraining.
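To make the combination the abstract describes concrete, below is a minimal sketch (not the authors' code) of the underlying K-shot weight-mixing idea: each candidate operation keeps K copies of its weights, and a small hypernetwork maps an architecture encoding to simplex coefficients that blend those copies, so per-architecture weights can be read out from the shared supernet without retraining. All names (`KShotConv`, `arch_dim`, the encoding format) are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KShotConv(nn.Module):
    """One supernet operation with K shared weight copies ("shots")."""

    def __init__(self, in_ch, out_ch, k_shots=4, arch_dim=16, kernel_size=3):
        super().__init__()
        # K independent copies of the convolution weight.
        self.weights = nn.Parameter(
            torch.randn(k_shots, out_ch, in_ch, kernel_size, kernel_size) * 0.01
        )
        self.bias = nn.Parameter(torch.zeros(out_ch))
        # Tiny hypernetwork: architecture encoding -> K mixing logits.
        self.hyper = nn.Sequential(
            nn.Linear(arch_dim, 32), nn.ReLU(), nn.Linear(32, k_shots)
        )
        self.padding = kernel_size // 2

    def forward(self, x, arch_code):
        # Convex (simplex) coefficients over the K weight copies,
        # conditioned on the sampled architecture.
        coeffs = F.softmax(self.hyper(arch_code), dim=-1)        # shape (K,)
        # Blend the K shots into one effective weight tensor.
        w = torch.einsum("k,koihw->oihw", coeffs, self.weights)
        return F.conv2d(x, w, self.bias, padding=self.padding)

# Usage: a sampled architecture is identified by its encoding; its
# operation weights are inferred from the shared shots, not retrained.
op = KShotConv(in_ch=8, out_ch=8)
x = torch.randn(2, 8, 32, 32)
arch_code = torch.randn(16)  # hypothetical encoding of one architecture
y = op(x, arch_code)
print(y.shape)               # torch.Size([2, 8, 32, 32])
```

The design point this sketch illustrates is why retraining becomes unnecessary: the hypernetwork, trained jointly with the shared shots, produces architecture-specific weights at deployment time, which is what "single train, multi deploy" refers to.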
Submission Number: 28