MC-DARTS: Model Size Constrained Differentiable Architecture Search

Published: 20 Oct 2022, Last Modified: 05 May 2023, HITY Workshop NeurIPS 2022
Keywords: Deep Learning, AutoML, Neural Architecture Search, Constrained Optimization
TL;DR: We propose a novel approach, model size constrained DARTS (MC-DARTS), which can efficiently search for network architectures under a model size constraint.
Abstract: Recently, extensive research has been conducted on automated machine learning (AutoML). Neural architecture search (NAS), a key component of AutoML, automatically optimizes neural network architectures for the data and task at hand. One promising way to search for high-accuracy models is gradient-based NAS, known as differentiable architecture search (DARTS). Previous DARTS-based studies have shown that the size of the optimal architecture depends on the size of the dataset: if the optimal architecture is small, searching over large architectures is unnecessary. Moreover, architecture size must be considered when deep learning is deployed on mobile devices and embedded systems, since memory on these platforms is limited. Therefore, in this paper, we propose a novel approach, model size constrained DARTS. The proposed approach adds constraints to DARTS so that the search accounts for both accuracy and model size. As a result, the proposed method can efficiently search for network architectures with short training time and high accuracy under the constrained conditions.
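The page does not publish the paper's exact formulation, but the core idea, adding a model size constraint to the DARTS architecture objective, can be illustrated with a minimal sketch. The sketch below treats the expected parameter count under the softmax-relaxed architecture weights as a differentiable size proxy and penalizes only the excess over a budget; all names here (ops, alpha, size_budget, lambda_size) and the soft-penalty formulation are illustrative assumptions, not the authors' code.

```python
# Hypothetical sketch of a size-constrained DARTS-style update on one edge.
import torch
import torch.nn as nn
import torch.nn.functional as F

# Candidate operations on one edge of the cell (toy example).
ops = nn.ModuleList([
    nn.Conv2d(16, 16, 3, padding=1),   # heavy op
    nn.Conv2d(16, 16, 1),              # light op
    nn.Identity(),                     # parameter-free op
])
# Parameter count of each candidate, used as the size proxy.
param_counts = torch.tensor(
    [sum(p.numel() for p in op.parameters()) for op in ops],
    dtype=torch.float,
)

alpha = torch.zeros(len(ops), requires_grad=True)  # architecture weights

def mixed_output(x):
    """DARTS mixed operation: softmax-weighted sum of candidate ops."""
    w = F.softmax(alpha, dim=0)
    return sum(wi * op(x) for wi, op in zip(w, ops))

def expected_size():
    """Differentiable size proxy: expected parameter count under alpha."""
    return (F.softmax(alpha, dim=0) * param_counts).sum()

size_budget = 600.0    # hypothetical parameter budget for this edge
lambda_size = 1e-3     # hypothetical penalty strength

x = torch.randn(2, 16, 8, 8)
target = torch.randn(2, 16, 8, 8)

opt = torch.optim.Adam([alpha], lr=3e-4)
for _ in range(100):
    loss_acc = F.mse_loss(mixed_output(x), target)
    # Penalize only the size in excess of the budget (soft constraint).
    loss_size = F.relu(expected_size() - size_budget)
    loss = loss_acc + lambda_size * loss_size
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In a full DARTS pipeline this penalized objective would sit inside the bilevel optimization, with alpha updated on validation data and operation weights on training data; the sketch isolates the constraint term only.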