Diversity Based Edge Pruning of Neural Networks Using Determinantal Point Processes

Published: 01 Apr 2021, Last Modified: 05 May 2023. Neural Compression Workshop @ ICLR 2021.
Keywords: Neural network pruning, Determinantal point process
TL;DR: Drawing motivation from synaptic diversity in the brain, we propose a novel diversity-based edge pruning technique for neural networks using DPPs.
Abstract: Deep learning architectures with a huge number of parameters are often compressed using pruning techniques. Two classes of pruning techniques are node pruning and edge pruning. A fairly recent work established that Determinantal Point Process (DPP) based node pruning empirically outperforms competing node pruning methods. However, one prominent appeal of edge pruning over node pruning is the consistent finding in the literature that sparse neural networks (edge pruned) generalize better than dense neural networks (node pruned). Building on this previous work and drawing motivation from synaptic diversity in the brain, we propose a novel diversity-based edge pruning technique for neural networks using DPPs. We then empirically show that DPP edge pruning for neural networks outperforms other competing methods (both edge and node) on real data.
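To make the core idea concrete, below is a minimal sketch of how DPP-based pruning can select a diverse subset of items (here, edges) given some vector representation of each edge. This is an illustrative assumption, not the paper's actual method: the edge representations, kernel construction, and greedy MAP selection shown here are hypothetical choices made for the example.

```python
import numpy as np

def greedy_dpp_map(L, k):
    """Greedy MAP approximation for a DPP with PSD kernel L.

    At each step, add the item that maximizes the log-determinant
    of the kernel submatrix restricted to the selected set. The
    determinant rewards subsets whose representations are diverse.
    """
    n = L.shape[0]
    selected, remaining = [], list(range(n))
    for _ in range(k):
        best, best_gain = None, -np.inf
        for i in remaining:
            idx = selected + [i]
            sign, logdet = np.linalg.slogdet(L[np.ix_(idx, idx)])
            gain = logdet if sign > 0 else -np.inf
            if gain > best_gain:
                best, best_gain = i, gain
        selected.append(best)
        remaining.remove(best)
    return selected

# Hypothetical setup: 20 edges, each described by an 8-dim feature vector
# (e.g. derived from activations). Similar edges yield a low determinant,
# so the selection favors mutually dissimilar (diverse) edges to keep.
rng = np.random.default_rng(0)
feats = rng.normal(size=(20, 8))
feats /= np.linalg.norm(feats, axis=1, keepdims=True)
L = feats @ feats.T + 1e-6 * np.eye(20)  # PSD similarity kernel
keep = greedy_dpp_map(L, k=5)            # indices of edges to retain
```

In an actual pruning pipeline, the edges not selected would have their weights zeroed out, yielding a sparse network.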