Neuron-Level Architecture Search for Efficient Model Design
Keywords: Neural Architecture Search, AutoML, Evolutionary Algorithms, Computational Efficiency
TL;DR: NAS method that designs architectures at the neuron level, balancing computational cost and model performance through evolutionary search.
Abstract: Neural Architecture Search (NAS) methods automate the design of neural network architectures. On many popular datasets, architectures discovered through NAS already outperform manually designed ones. However, most existing approaches rely on predefined sets of common operations, such as fully connected or convolutional layers with fixed parameter settings. As a result, these methods are limited in their ability to discover fundamentally new architectures with complex topologies. In this work, we propose an approach that designs architectures at the most fundamental level, directly manipulating individual neurons and their connections. In addition, we focus on finding computationally efficient models by eliminating redundant components that may be inherent in manually constructed layers. To this end, we introduce an evolutionary algorithm that balances computational cost against architecture performance. Experimental results demonstrate that our approach can discover architectures that match the performance of manually designed networks while reducing FLOPs by more than a factor of three on image classification datasets.
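The abstract describes an evolutionary search over neuron-level connectivity whose fitness trades model performance against computational cost. The following is a minimal sketch of such a loop, not the paper's method: the genome encoding (a set of directed connections), the mutation operator, the accuracy proxy, and the `cost_weight` coefficient are all illustrative assumptions.

```python
import random

# Toy neuron-level genome: a set of directed connections between neuron ids.
# Illustrative sketch only; the encoding, operators, and objective used in
# the paper may differ.

def flops(genome):
    """Assume each connection costs one multiply-accumulate (2 FLOPs)."""
    return 2 * len(genome)

def evaluate_accuracy(genome):
    """Placeholder: in practice, decode the genome into a network and
    train/evaluate it on data. Here, a noisy proxy rewarding moderate
    connectivity stands in for real accuracy."""
    return 1.0 - abs(len(genome) - 50) / 100 + random.uniform(-0.02, 0.02)

def fitness(genome, cost_weight=1e-3):
    """Scalarized trade-off (assumed form): performance minus a FLOPs penalty."""
    return evaluate_accuracy(genome) - cost_weight * flops(genome)

def mutate(genome, n_neurons=32):
    """Add or remove a single connection between randomly chosen neurons."""
    child = set(genome)
    edge = (random.randrange(n_neurons), random.randrange(n_neurons))
    if edge in child and len(child) > 1:
        child.remove(edge)  # pruning a connection reduces FLOPs
    else:
        child.add(edge)
    return child

def evolve(pop_size=20, generations=100):
    pop = [{(random.randrange(32), random.randrange(32)) for _ in range(60)}
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]  # truncation selection
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print(f"best genome: {len(best)} connections, ~{flops(best)} FLOPs")
```

In a realistic setting, `evaluate_accuracy` would train or estimate the decoded network's accuracy, and Pareto-based selection could replace the scalarized penalty to expose the full cost-performance frontier.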
Submission Number: 23