Evolutionary NetArchitecture Search for Deep Neural Networks Pruning

Published: 01 Jan 2019, Last Modified: 10 May 2023, ACAI 2019
Abstract: Network pruning is an architecture search process that determines the state (removed/retained) of neurons in a network. It is a combinatorial optimization problem, and this combinatorial optimization problem is NP-hard. Most existing pruning methods prune channels/neurons under the assumption that they are independent in the network. However, dependencies do exist among channels/neurons. We attempt to solve this combinatorial optimization problem with an evolutionary algorithm (EA). However, the traditional EA cannot be applied directly to deep neural networks (DNNs) because the problem dimension is too high. An attention mechanism (AM) can provide parameter importance scores that reduce the difficulty of the problem, making the architecture search process more effective. Therefore, combining EA and AM, we propose an Evolutionary NetArchitecture Search (EvoNAS) method to solve the network pruning problem. We demonstrate the effectiveness of our method on common datasets with ResNet, ResNeXt, and VGG. For example, for ResNet on CIFAR-10, EvoNAS reduces computing operations by 73.40% and parameters by 73.95% with a 0.13% increase in test accuracy. Compared with state-of-the-art methods, EvoNAS improves the reduction ratio by at least 30%.
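To illustrate the general idea described in the abstract, below is a minimal, hypothetical sketch (not the authors' EvoNAS implementation) of an evolutionary search over binary channel masks, where stand-in importance scores (playing the role of the attention-derived scores) bias initialization and mutation. The fitness function, population size, and all numeric settings are assumptions for illustration only.

```python
# Hypothetical sketch: evolutionary search over binary channel masks for one layer.
# "importance" stands in for attention-derived importance scores; the fitness is a
# toy surrogate, not the accuracy/FLOPs objective used in the paper.
import numpy as np

rng = np.random.default_rng(0)

n_channels = 64                      # channels in one layer (assumed)
importance = rng.random(n_channels)  # placeholder for attention-based scores

def fitness(mask):
    # Reward retained importance, penalize the fraction of channels kept.
    retained = importance[mask.astype(bool)].sum() / importance.sum()
    kept_ratio = mask.mean()
    return retained - 0.5 * kept_ratio

def init_population(pop_size):
    # Keep probability grows with importance (scores normalized to [0, 1]).
    p_keep = 0.3 + 0.6 * (importance - importance.min()) / np.ptp(importance)
    return (rng.random((pop_size, n_channels)) < p_keep).astype(np.int8)

def mutate(mask, base_rate=0.05):
    # Important channels mutate less often, so they tend to stay kept.
    rate = base_rate * (1.0 - 0.8 * importance / importance.max())
    flips = rng.random(n_channels) < rate
    child = mask.copy()
    child[flips] ^= 1
    return child

def crossover(a, b):
    # Uniform crossover between two parent masks.
    pick = rng.random(n_channels) < 0.5
    return np.where(pick, a, b).astype(np.int8)

pop = init_population(32)
for gen in range(50):
    scores = np.array([fitness(m) for m in pop])
    elites = pop[np.argsort(scores)[::-1][:8]]   # truncation selection
    children = []
    while len(children) < len(pop) - len(elites):
        pa, pb = elites[rng.integers(8)], elites[rng.integers(8)]
        children.append(mutate(crossover(pa, pb)))
    pop = np.vstack([elites, np.array(children)])

best = pop[np.argmax([fitness(m) for m in pop])]
print("kept channels:", int(best.sum()), "of", n_channels)
```

In practice, the fitness evaluation would involve the pruned network's accuracy and compute cost rather than this surrogate; the sketch only shows how importance scores can shrink the effective search space for the EA.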