A Simple yet Effective Method to Prune Dense Layers of Neural Networks

Mohammad Babaeizadeh, Paris Smaragdis, Roy H. Campbell

Nov 04, 2016 · ICLR 2017 conference submission
  • Abstract: Neural networks are usually over-parameterized, with significant redundancy in the number of neurons, which results in unnecessary computation and memory usage at inference time. One common approach to this issue is to prune these large networks by removing redundant neurons and parameters while maintaining accuracy. In this paper, we propose NoiseOut, a fully automated pruning algorithm based on the correlation between activations of neurons in the hidden layers. We prove that adding extra output neurons with entirely random targets results in higher correlation between hidden neurons, which makes pruning by NoiseOut even more effective. Finally, we test our method on various networks and datasets. These experiments demonstrate high pruning rates while maintaining the accuracy of the original network. (A minimal sketch of the correlation-based merging step is included below.)
  • TL;DR: Pruning neural networks by adding output neurons with fully random targets and removing strongly correlated neurons.
  • Keywords: Deep learning
  • Conflicts: illinois.edu
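
As a reading aid, here is a minimal NumPy sketch of the correlation-based merging step described in the abstract. Everything in it is an illustrative assumption rather than the authors' implementation: the function name prune_most_correlated, the dense-layer weight layout, and the least-squares merge rule are hypothetical, and the random-target output neurons the paper adds to increase correlation are not shown.

```python
import numpy as np

def prune_most_correlated(W_in, b_in, W_out, b_out, activations):
    """Merge the two most correlated hidden neurons into one.

    Hypothetical sketch: if neuron j's activation is well approximated
    by a linear map of neuron i's (h_j ~= alpha * h_i + beta), neuron j
    can be removed and its contribution folded into neuron i's outgoing
    weights, approximately preserving the next layer's pre-activations.

    W_in:  (n_hidden, n_inputs)   incoming weights of the hidden layer
    b_in:  (n_hidden,)            hidden-layer biases
    W_out: (n_outputs, n_hidden)  outgoing weights to the next layer
    b_out: (n_outputs,)           next-layer biases
    activations: (n_samples, n_hidden) hidden activations on a batch
    """
    # Pairwise Pearson correlations between hidden-neuron activations.
    corr = np.corrcoef(activations, rowvar=False)
    np.fill_diagonal(corr, 0.0)  # ignore self-correlation
    i, j = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)

    # Least-squares fit h_j ~= alpha * h_i + beta.
    alpha, beta = np.polyfit(activations[:, i], activations[:, j], deg=1)

    # Fold neuron j's outgoing weights into neuron i and the next bias.
    W_out = W_out.copy()
    b_out = b_out + beta * W_out[:, j]
    W_out[:, i] += alpha * W_out[:, j]

    # Drop neuron j from the layer.
    keep = np.arange(W_in.shape[0]) != j
    return W_in[keep], b_in[keep], W_out[:, keep], b_out
```

One plausible way to use such a step, consistent with the abstract, is to apply it repeatedly during training and stop once the network's accuracy starts to degrade.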
