On Iterative Neural Network Pruning, Reinitialization, and the Similarity of Masks

Anonymous

Sep 25, 2019 · ICLR 2020 Conference Blind Submission
  • Keywords: Pruning, Lottery Tickets, Science of Deep Learning, Experimental Deep Learning, Empirical Study
  • TL;DR: Different pruning techniques identify multiple trainable sub-networks within an over-parameterized model, with similar performance but significantly different emergent connectivity structures, weight evolution, and learned functions.
  • Abstract: We examine how recently documented, fundamental phenomena in deep learning models subject to pruning are affected by changes in the pruning procedure. Specifically, we analyze differences in the connectivity structure and learning dynamics of pruned models found through a set of common iterative pruning techniques, to address questions of uniqueness of trainable, high-sparsity sub-networks, and their dependence on the chosen pruning method. In convolutional layers, we document the emergence of structure induced by magnitude-based unstructured pruning in conjunction with weight rewinding that resembles the effects of structured pruning. We also show empirical evidence that weight stability can be automatically achieved through apposite pruning techniques.
  • Code: https://github.com/iclr-8dafb2ab/iterative-pruning-reinit (a minimal sketch of the pruning-and-rewinding loop described in the abstract follows below)
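
The abstract centers on iterative magnitude-based unstructured pruning combined with weight rewinding. The sketch below illustrates that loop under stated assumptions: it uses PyTorch's torch.nn.utils.prune, and the training routine train_fn, the number of rounds, and the per-round pruning fraction are hypothetical placeholders, not the authors' actual configuration.

```python
# A minimal, hypothetical sketch of iterative magnitude pruning with weight
# rewinding. train_fn, rounds, and frac are illustrative placeholders.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

def iterative_prune_with_rewind(model, train_fn, rounds=5, frac=0.2):
    prunable = [m for m in model.modules()
                if isinstance(m, (nn.Linear, nn.Conv2d))]
    # Snapshot initial weights so surviving weights can be rewound each round.
    init = {m: m.weight.detach().clone() for m in prunable}

    for _ in range(rounds):
        train_fn(model)  # train to completion (placeholder routine)

        # Magnitude-based unstructured pruning: remove the smallest-magnitude
        # weights; repeated calls stack masks, so sparsity grows each round.
        for m in prunable:
            prune.l1_unstructured(m, name="weight", amount=frac)

        # Weight rewinding: reset the underlying (unmasked) weights to their
        # initial values while the accumulated pruning mask stays in place.
        with torch.no_grad():
            for m in prunable:
                m.weight_orig.copy_(init[m])
    return model
```

Repeated calls to prune.l1_unstructured combine masks through PyTorch's PruningContainer, so the mask compounds across rounds while the rewind step restores only the values of the surviving weights.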