On Iterative Neural Network Pruning, Reinitialization, and the Similarity of Masks

25 Sep 2019 (modified: 24 Dec 2019) · ICLR 2020 Conference Blind Submission
  • Keywords: Pruning, Lottery Tickets, Science of Deep Learning, Experimental Deep Learning, Empirical Study
  • TL;DR: Different pruning techniques identify multiple trainable sub-networks within an over-parametrized model, with similar performance but significantly different emergent connectivity structure, weight evolution, and learned functions.
  • Abstract: We examine how recently documented, fundamental phenomena in deep learning models subject to pruning are affected by changes in the pruning procedure. Specifically, we analyze differences in the connectivity structure and learning dynamics of pruned models found through a set of common iterative pruning techniques, to address questions of uniqueness of trainable, high-sparsity sub-networks, and their dependence on the chosen pruning method. In convolutional layers, we document the emergence of structure induced by magnitude-based unstructured pruning in conjunction with weight rewinding that resembles the effects of structured pruning. We also show empirical evidence that weight stability can be automatically achieved through apposite pruning techniques.
  • Code: https://github.com/iclr-8dafb2ab/iterative-pruning-reinit
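The abstract's central procedure, iterative magnitude pruning with weight rewinding, can be sketched as follows. This is a minimal NumPy illustration under assumed simplifications (a single flat weight array, a caller-supplied `train_fn` standing in for actual training); the function names `magnitude_prune` and `iterative_prune_with_rewind` are hypothetical and not taken from the linked repository.

```python
import numpy as np

def magnitude_prune(weights, mask, frac):
    """Zero out the smallest-magnitude fraction `frac` of the
    currently surviving (unmasked) weights."""
    alive = np.abs(weights[mask])
    k = int(frac * alive.size)
    if k == 0:
        return mask
    threshold = np.sort(alive)[k - 1]
    # Keep only weights strictly above the cutoff.
    return mask & (np.abs(weights) > threshold)

def iterative_prune_with_rewind(init_weights, train_fn, rounds=3, frac=0.2):
    """Iterative magnitude pruning with weight rewinding: after each
    pruning round, surviving weights are reset ("rewound") to their
    values at initialization rather than kept at their trained values."""
    mask = np.ones_like(init_weights, dtype=bool)
    weights = init_weights.copy()
    for _ in range(rounds):
        # Train with pruned weights held at zero (train_fn is a stand-in).
        weights = train_fn(weights * mask) * mask
        # Prune a fraction of the surviving weights by magnitude.
        mask = magnitude_prune(weights, mask, frac)
        # Rewind survivors to their initial values.
        weights = init_weights * mask
    return weights, mask
```

With `frac=0.2` and three rounds, the surviving fraction shrinks geometrically (roughly 0.8³ ≈ 51% of weights remain), and the returned weights agree with the initialization on every unpruned position, which is the rewinding property the paper's experiments rely on.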