Keywords: NAS, ranking, search, speedup
TL;DR: Faster architecture ranking to speed up discrete NAS methods.
Abstract: The fundamental problem in Neural Architecture Search (NAS) is to efficiently find high-performing architectures in a search space. We propose FEAR, a simple but powerful method for ranking architectures in any search space. FEAR leverages the view that neural networks are powerful non-linear feature extractors. By training different architectures in the search space to the same training or validation error, then freezing most of each architecture and comparing the usefulness of the extracted features on the task dataset of interest, we obtain quick estimates of relative performance. We validate FEAR on the Natsbench topology search space on three different datasets against competing baselines and show strong ranking correlation, especially compared to recently proposed zero-cost methods. FEAR particularly excels at ranking high-performing architectures in the search space. When used in the inner loop of discrete search algorithms such as random search, FEAR can cut down search time by approximately 2.4x without losing accuracy. We additionally conduct an empirical study of recently proposed zero-cost ranking measures and find that their ranking performance breaks down as training proceeds, and that data-agnostic ranking scores which ignore the dataset do not generalize across dissimilar datasets.
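The ranking procedure described above can be summarized in a minimal sketch, assuming a PyTorch model and standard DataLoaders. The function name `fear_score`, the accuracy threshold, and all hyperparameters below are illustrative placeholders, not the paper's exact settings; the intent is only to show the train-to-threshold, freeze, and brief-fine-tune pattern used for ranking.

```python
import copy
import torch
import torch.nn as nn


def fear_score(model: nn.Module, train_loader, val_loader,
               target_train_acc: float = 0.6, freeze_epochs: int = 1,
               lr: float = 0.01, device: str = "cpu") -> float:
    """FEAR-style ranking proxy for one candidate architecture:
    1) train until a fixed training-accuracy threshold is reached,
    2) freeze everything except the final classifier head,
    3) briefly fine-tune the head and return validation accuracy as the score.
    All thresholds and hyperparameters here are illustrative assumptions.
    """
    model = copy.deepcopy(model).to(device)
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)

    # Stage 1: train to a common training accuracy so that candidates are
    # compared at an equal level of fit rather than after equal epochs.
    reached = False
    while not reached:
        correct, total = 0, 0
        for x, y in train_loader:
            x, y = x.to(device), y.to(device)
            optimizer.zero_grad()
            out = model(x)
            loss = criterion(out, y)
            loss.backward()
            optimizer.step()
            correct += (out.argmax(dim=1) == y).sum().item()
            total += y.size(0)
        reached = correct / total >= target_train_acc

    # Stage 2: freeze most of the network, treating it as a fixed feature
    # extractor; only the final weight and bias remain trainable.
    params = list(model.parameters())
    for p in params[:-2]:
        p.requires_grad = False
    head_optimizer = torch.optim.SGD(
        [p for p in params if p.requires_grad], lr=lr, momentum=0.9)

    for _ in range(freeze_epochs):
        for x, y in train_loader:
            x, y = x.to(device), y.to(device)
            head_optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()
            head_optimizer.step()

    # Stage 3: validation accuracy with frozen features is the ranking score.
    model.eval()
    correct, total = 0, 0
    with torch.no_grad():
        for x, y in val_loader:
            x, y = x.to(device), y.to(device)
            correct += (model(x).argmax(dim=1) == y).sum().item()
            total += y.size(0)
    return correct / total
```

In a discrete search loop (e.g., random search), this score would be computed for each sampled architecture and the candidate with the highest score kept, which is how such a proxy can shorten the inner evaluation loop.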
Ethics Statement: NAS methods can have a large carbon footprint, and our research aims to speed up a class of NAS methods. It has also been identified that NAS methods can exacerbate bias in learning-based models. By making NAS more accessible before this bias amplification is better understood, models deployed in real-world production pipelines may become more biased as NAS gains popularity.