A Compact Neural Architecture Search for Accelerating Image Classification Models

Published: 01 Jan 2021, Last Modified: 12 May 2023, ICTC 2021
Abstract: Automated Machine Learning (AutoML) has become an inevitable trend, providing automatic, well-suited solutions to AI tasks without requiring extensive effort from experts. Neural Architecture Search (NAS), a subfield of AutoML, has produced automated models that solve fundamental computer vision problems such as image recognition and object detection. NAS with differentiable search strategies has significantly reduced the GPU time spent on computation. In this paper, we present an effective algorithm that expands the search space by selecting operation candidates from the initial set in different ways and executing the resulting spaces concurrently. The extended search space gives NAS more opportunities to find good architectures by running the group of search spaces in overlapping time periods instead of sequentially. Our approach, called Accelerated NAS, shortens search time by 1.8x compared to previous works. In addition, Accelerated NAS generates promising neural architectures with comparable performance and low inference time.
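The abstract's core idea, deriving several search spaces from one initial operation set and searching them in overlapping time periods rather than one after another, can be sketched as follows. This is a minimal illustration under assumptions: the operation names, the subset size, and the dummy scoring function are all hypothetical stand-ins, since the paper does not specify them here; a real implementation would run a differentiable (DARTS-style) search inside each worker.

```python
import random
from concurrent.futures import ThreadPoolExecutor

# Hypothetical initial operation candidates (illustrative names, not from the paper).
INITIAL_OPS = ["conv_3x3", "conv_5x5", "sep_conv_3x3",
               "max_pool_3x3", "avg_pool_3x3", "skip_connect"]

def sample_search_space(ops, k, seed):
    """Derive one expanded search space by selecting k candidate operations."""
    rng = random.Random(seed)
    return sorted(rng.sample(ops, k))

def run_search(space):
    """Stand-in for a differentiable search over one space.

    A real implementation would optimize architecture weights on the space;
    here we return a dummy (validation-score, space) pair so the sketch runs.
    """
    rng = random.Random(",".join(space))  # deterministic per space
    score = rng.uniform(0.8, 0.95)       # pretend validation accuracy
    return score, space

# Build several search spaces from the initial set, then search them
# concurrently (overlapping time periods) instead of sequentially.
spaces = [sample_search_space(INITIAL_OPS, 4, seed) for seed in range(3)]
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(run_search, spaces))

best_score, best_space = max(results)
print(best_space)
```

The concurrency here only models the scheduling idea from the abstract; the claimed 1.8x speedup depends on the actual search cost per space, which this toy scoring function does not capture.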
