DensEMANN: How to Automatically Generate an Efficient while Compact DenseNet

07 May 2023 (modified: 12 Dec 2023) Submitted to NeurIPS 2023
Keywords: neural network, convolutional neural network, neural architecture search, NAS, neural architecture optimization, constructive algorithms, growing, pruning, DenseNet, EMANN, in-supervised, self-structuring, connection strength
TL;DR: A NAS algorithm based on a finite-state machine that simultaneously grows and trains tiny DenseNets with competitive accuracy on various benchmarks
Abstract: We present a new and improved version of DensEMANN, an algorithm that grows small DenseNet architectures virtually from scratch while simultaneously training them on target data. Following a finite-state machine driven by the network's accuracy and the evolution of its weight values, the algorithm adds and prunes dense layers and convolution filters during training, and only when this leads to a significant accuracy improvement. We show that our improved version of DensEMANN quickly and efficiently searches for small yet competitive DenseNet architectures for well-known image classification benchmarks. In half a GPU day or less, this method generates networks with under 500k parameters and between 93% and 95% accuracy on various benchmarks (CIFAR-10, Fashion-MNIST, SVHN). For CIFAR-10, we show that it comes very close to the state-of-the-art Pareto front between accuracy and size, finding networks with 98.84% of the accuracy and 98.08% of the size of the closest Pareto-optimal competitor, in only 0.70% of the search time it took to find that competitor. We also show that DensEMANN generates its networks with optimal weight values, and identify a simple mechanism that allows it to generate such optimal weights. All in all, we show this "in-supervised", essentially incremental approach to be promising for the fast design of compact yet competitive convolutional networks.
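
To make the grow-and-prune logic described in the abstract concrete, below is a minimal, self-contained Python sketch of a DensEMANN-style controller: a finite-state machine that, depending on whether validation accuracy keeps improving, decides to keep training, widen the last dense layer, append a new dense layer, or prune weak filters. All names, states, thresholds and helper functions here are hypothetical illustrations; the actual algorithm's states and its accuracy/weight-based criteria are defined in the paper, and real training on DenseNet blocks replaces the toy accuracy model used below.

# Hypothetical, toy illustration of a grow-and-prune controller driven by a
# finite-state machine. It does NOT reproduce DensEMANN's actual FSM: the
# states, thresholds, and fake accuracy model below are placeholders.

import random
from enum import Enum, auto


class State(Enum):
    TRAIN = auto()        # keep training the current architecture
    ADD_FILTERS = auto()  # widen the last dense layer while it still helps
    ADD_LAYER = auto()    # append a new dense layer once widening stalls
    PRUNE = auto()        # drop weak filters, then settle back into training


def train_one_epoch(architecture):
    """Stand-in for real DenseNet training; returns a fake validation accuracy."""
    layers, filters = architecture
    return min(0.95, 0.5 + 0.01 * layers * filters + random.uniform(-0.01, 0.01))


def grow_and_prune(epochs=60, improvement_threshold=0.005, patience=3):
    architecture = [1, 8]                 # [dense layers, filters in last layer]
    state, best_acc, stall = State.TRAIN, 0.0, 0
    for epoch in range(epochs):
        acc = train_one_epoch(architecture)
        improved = acc - best_acc > improvement_threshold
        best_acc = max(best_acc, acc)
        stall = 0 if improved else stall + 1

        if state is State.TRAIN and stall >= patience:
            architecture[1] += 4          # accuracy stalled: try widening
            state = State.ADD_FILTERS
        elif state is State.ADD_FILTERS:
            if improved:
                architecture[1] += 4      # widening paid off: keep adding filters
            else:
                architecture[0] += 1      # widening stopped helping: deepen instead
                state = State.ADD_LAYER
        elif state is State.ADD_LAYER and not improved:
            architecture[1] = max(4, architecture[1] - 2)  # prune weak filters
            state = State.PRUNE
        elif state is State.PRUNE:
            stall, state = 0, State.TRAIN  # settle and resume plain training

        print(f"epoch {epoch:2d}  state={state.name:<11}  acc={acc:.3f}  arch={architecture}")
    return architecture, best_acc


if __name__ == "__main__":
    random.seed(0)
    print(grow_and_prune())

In the real algorithm, decisions about adding and pruning filters also take into account the evolution of the weight values (connection strength), which this toy controller omits for brevity.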
Supplementary Material: zip
Submission Number: 2609