Automatic Generation of Neural Architecture Search Spaces

Published: 19 Jan 2022, Last Modified: 05 May 2023. CLeaR Workshop Poster.
Keywords: Neural architecture generation, constraint programming, classification
TL;DR: Generation of uniformly sampled, distinct neural architectures from a constraint formula based on valid-padding convolutions, solved with SAT and SMT techniques.
Abstract: Neural Architecture Search (NAS) is receiving growing attention as the need to remove human bias from neural network design rises. There is extensive research aimed at beating state-of-the-art NAS algorithms; however, these advances do not focus directly on the search space those algorithms explore. Here, we propose a framework that encodes the structure of a convolutional neural network, respecting the arithmetical relation of the kernel and stride sizes to the input and output shapes. The framework consists of a constraint formula that, given the structure of the problem (input and output shapes), can produce specification properties of a neural architecture through a solver. We show that this methodology can assemble networks of arbitrary sizes and structures, which yield unique and uniform search spaces. To compare the resulting architectures, we propose a metric that computes dissimilarity in terms of architectural structure. We empirically show that generating dissimilar architectures implies dissimilarities in performance; correspondingly, similar architectures perform similarly.
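The abstract's core constraint is the arithmetic relating kernel size k and stride s to input and output shapes under valid padding: out = floor((in - k) / s) + 1. As a minimal illustrative sketch (this is not the paper's SAT/SMT encoding, and `valid_conv_pairs` is a name of our own invention), the set of (k, s) pairs satisfying that relation for one layer can be enumerated directly:

```python
def valid_conv_pairs(in_size: int, out_size: int) -> list[tuple[int, int]]:
    """Enumerate (kernel, stride) pairs for a valid-padding convolution
    mapping a 1-D spatial size `in_size` to `out_size`.

    Uses the valid-padding formula: out = floor((in - k) / s) + 1.
    A solver-based encoding, as in the paper, would instead pose this
    as a constraint and let SAT/SMT search the space.
    """
    pairs = []
    for k in range(1, in_size + 1):
        for s in range(1, in_size + 1):
            if (in_size - k) // s + 1 == out_size:
                pairs.append((k, s))
    return pairs

# Example: a 32-wide input reduced to 15, e.g. kernel 4 with stride 2.
print(valid_conv_pairs(32, 15))
```

Chaining this relation across layers (each layer's output shape becoming the next layer's input shape) is what lets a solver derive full architecture specifications from the problem's input and output shapes alone.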