Multi-Objective Evolutionary Search of Compact Convolutional Neural Networks with Training-Free Estimation
Abstract: With the increasing demand for deploying convolutional neural networks (CNNs) on resource-constrained devices, designing high-performance yet lightweight architectures has become a central challenge for neural architecture search (NAS). This paper develops an evolutionary multi-objective optimization framework to flexibly explore CNNs with different levels of compactness. A multi-scale convolutional module is designed to enhance the feature learning capability. To further improve search efficiency, a low-cost metric based on the neural tangent kernel is leveraged to estimate the trainability of candidate CNNs, avoiding an expensive training process. Experiments are carried out on CIFAR-10 and CIFAR-100 to verify the effectiveness of the proposed method. Compared with state-of-the-art algorithms, the proposed method discovers architectures with fewer parameters and competitive classification performance in at most 0.2 GPU days, showing a better trade-off between accuracy and model complexity.
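The training-free estimation mentioned above can be illustrated with a minimal sketch: a common NTK-based proxy (as in training-free NAS work such as TE-NAS) scores a randomly initialized network by the condition number of its empirical neural tangent kernel, where a smaller condition number is taken to indicate better trainability. The code below is a hypothetical toy example, not the paper's implementation; the network, the finite-difference Jacobian, and all function names are assumptions for illustration only.

```python
import numpy as np

def ntk_condition_number(forward, params, X, eps=1e-4):
    """Build the empirical NTK of `forward` at `params` on batch X via a
    finite-difference Jacobian, and return its condition number.
    Lower values are commonly read as better trainability (illustrative proxy)."""
    base = forward(params, X)               # outputs, shape (n,)
    n, p = base.size, params.size
    J = np.zeros((n, p))                    # Jacobian d output / d params
    for j in range(p):
        pert = params.copy()
        pert[j] += eps
        J[:, j] = (forward(pert, X) - base) / eps
    ntk = J @ J.T                           # empirical NTK, shape (n, n)
    eig = np.linalg.eigvalsh(ntk)           # eigenvalues, ascending order
    return eig[-1] / max(eig[0], 1e-12)     # lambda_max / lambda_min

# Toy one-hidden-layer ReLU network with all weights flattened into one vector
# (a hypothetical stand-in for a candidate CNN from the search space).
def make_forward(d_in, d_hid):
    def forward(theta, X):
        W1 = theta[:d_in * d_hid].reshape(d_in, d_hid)
        w2 = theta[d_in * d_hid:]
        return np.maximum(X @ W1, 0.0) @ w2  # scalar output per sample
    return forward

rng = np.random.default_rng(0)
d_in, d_hid, n = 4, 8, 6
theta = rng.standard_normal(d_in * d_hid + d_hid) * 0.5
X = rng.standard_normal((n, d_in))
score = ntk_condition_number(make_forward(d_in, d_hid), theta, X)
print(score)
```

In a multi-objective search, such a score could be paired with the parameter count so that candidates are compared without any training; the actual metric and search details are those described in the paper, not this sketch.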