Abstract: Neural network pruning has attracted enormous attention since it offers a promising way to facilitate the deployment of deep neural networks on resource-limited devices. However, most existing methods hinge on filter-selection criteria pre-defined by researchers, and as network pruning research advances, these criteria have grown increasingly complex. In this paper, we propose a brain-inspired filter pruning algorithm for deep neural networks that requires no selection criterion. Inspired by the reorganization of brain function in humans after irreversible damage, we treat the weights to be pruned as damaged neurons and reorganize the network's function through a novel training process proposed in this paper. After pruning, the retained parameters take over the function of those that were pruned. The method applies broadly to common architectures and requires no hand-designed filter importance measure. As the first attempt at weight-importance-irrelevant pruning, BFRIFP provides novel insight into the network pruning problem. Experiments on CIFAR-10 and ImageNet demonstrate the effectiveness of our new perspective on network pruning compared to traditional pruning algorithms.
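To make the criterion-free idea concrete, below is a minimal PyTorch sketch of the general pattern the abstract describes, not the paper's actual algorithm (whose training process is not specified here). It selects filters uniformly at random (no importance measure), zeroes them out as "damaged neurons," and blocks their gradients so that fine-tuning forces the surviving weights to take over their function. The function names and the keep ratio are illustrative assumptions.

```python
# Hypothetical sketch of criterion-free filter pruning with functional
# reorganization; NOT the paper's exact method.
import torch
import torch.nn as nn

def random_filter_mask(conv: nn.Conv2d, keep_ratio: float = 0.5) -> torch.Tensor:
    """Choose filters to keep uniformly at random -- no importance criterion."""
    n = conv.out_channels
    keep = torch.randperm(n)[: int(n * keep_ratio)]
    mask = torch.zeros(n)
    mask[keep] = 1.0
    return mask  # shape: (out_channels,)

def damage_filters(conv: nn.Conv2d, mask: torch.Tensor) -> None:
    """Zero the 'damaged' filters and keep them zero during fine-tuning."""
    m = mask.view(-1, 1, 1, 1).to(conv.weight.device)
    conv.weight.data.mul_(m)                      # zero pruned filters
    if conv.bias is not None:
        conv.bias.data.mul_(mask.to(conv.bias.device))
    conv.weight.register_hook(lambda g: g * m)    # no gradient to pruned filters

# Fine-tuning then proceeds as usual (model/loader/criterion are assumed):
# the kept weights are updated and gradually compensate for the pruned ones,
# mirroring the reorganization of function after neural damage.
# for x, y in loader:
#     loss = criterion(model(x), y)
#     optimizer.zero_grad(); loss.backward(); optimizer.step()
```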