Mini-NAS: A Neural Architecture Search Framework for Small Scale Image Classification Applications

Dec 21, 2020 (edited Feb 26, 2021) · tinyML 2021 Research Symposium Blind Submission · Readers: Everyone
  • Keywords: TinyML Datasets, Neural Architecture Search, Search Spaces, Tiny AutoML, Image Classification, Convolutional Networks
  • TL;DR: This work proposes a suite of 30 small-scale image classification datasets and a NAS framework that can discover high-accuracy, high-efficiency networks for various datasets including CIFAR-10.
  • Abstract: Neural architecture search (NAS) has shown promising results on image classification datasets such as CIFAR-10 and ImageNet. However, the desire for higher accuracy, coupled with the need for computationally affordable NAS on these benchmarks alone, has had a profound effect on the design of NAS search spaces and algorithms. Many real-world use cases, on the other hand, may not come with datasets as large as ImageNet or even CIFAR-10, and the required network sizes may be only a few hundred KBs, so the optimizations made to speed up NAS may not be ideal for them. For instance, modular search spaces reduce search complexity compared to global ones, but they offer only partial network discovery and lose fine-grained control over network efficiency. Similarly, the transition from algorithms that search discrete spaces to those that search continuous ones brings significant efficiency gains, but the reward signals in the former provide more confident search directions. In this work, we first present a suite of 30 image classification datasets that mimic possible real-world use cases. Next, we present a powerful yet minimal global search space that contains all the vital ingredients for creating structurally diverse yet parameter-efficient networks. Lastly, we propose an algorithm that can efficiently navigate a huge discrete search space and is specifically tailored to discovering high-accuracy, low-complexity tiny convolutional networks. The proposed NAS system, Mini-NAS, discovers networks that are on average 14.7x more parameter-efficient than MobileNetV2 across the 30 datasets while achieving on-par accuracy. On CIFAR-10, Mini-NAS discovers a model that is 2.3x, 1.9x and 1.2x smaller than the smallest models discovered by RL, gradient-based and evolutionary NAS methods respectively, at a search cost of only 2.4 days.
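  • Illustration: The abstract contrasts discrete and continuous search, arguing that discrete reward signals give more confident search directions when targeting tiny, parameter-efficient networks. The sketch below is a minimal, hedged illustration of that general idea, not the paper's Mini-NAS algorithm: it mutates architectures in a toy discrete search space and keeps the candidate with the best accuracy-minus-size proxy reward. All names (`LAYER_CHOICES`, `proxy_reward`, the parameter counts) are invented for the example; a real system would train and evaluate each candidate network instead of using a closed-form proxy.

```python
import random

# Hypothetical discrete search space: an architecture is a list of layer
# choices. These operations and parameter counts are illustrative only.
LAYER_CHOICES = ["conv3x3", "conv5x5", "dwconv3x3", "maxpool"]
PARAMS = {"conv3x3": 9_000, "conv5x5": 25_000, "dwconv3x3": 1_200, "maxpool": 0}
MAX_DEPTH = 6

def random_arch():
    """Sample a random architecture from the discrete space."""
    depth = random.randint(2, MAX_DEPTH)
    return [random.choice(LAYER_CHOICES) for _ in range(depth)]

def proxy_reward(arch):
    """Toy stand-in for trained accuracy minus a parameter-count penalty.
    A real NAS system would train/evaluate the candidate network here."""
    acc = sum(1.0 for op in arch if op != "maxpool") / MAX_DEPTH
    size = sum(PARAMS[op] for op in arch)
    return acc - 1e-5 * size

def mutate(arch):
    """Replace one randomly chosen layer with another discrete choice."""
    child = list(arch)
    child[random.randrange(len(child))] = random.choice(LAYER_CHOICES)
    return child

def evolutionary_search(steps=200, seed=0):
    """Greedy mutation-based search over the discrete space: keep the
    mutated child only when its reward improves on the current best."""
    random.seed(seed)
    best = random_arch()
    best_r = proxy_reward(best)
    for _ in range(steps):
        cand = mutate(best)
        r = proxy_reward(cand)
        if r > best_r:
            best, best_r = cand, r
    return best, best_r

if __name__ == "__main__":
    arch, reward = evolutionary_search()
    print(arch, round(reward, 4))
```

    Because the reward directly penalizes parameter count, the search drifts toward cheap operations (here, depthwise convolutions), mirroring in miniature how a size-aware discrete reward can steer NAS toward tiny networks.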