Reducing the number of neurons of Deep ReLU Networks based on the current theory of Regularization

28 Sept 2020 (modified: 05 May 2023) | ICLR 2021 Conference Blind Submission | Readers: Everyone
Keywords: Reduction, Compression, Regularization, Theory, Pruning, Deep, Interpretability, Generalization
Abstract: We introduce a new Reduction Algorithm which exploits the properties of ReLU neurons to significantly reduce the number of neurons in a trained Deep Neural Network. The algorithm builds on the recent theory of implicit and explicit regularization in Deep ReLU Networks from Maennel et al. (2018) and from the authors' own work. We discuss two experiments which illustrate how effectively the algorithm reduces the number of neurons while provably leaving the learned function almost unchanged on the training data (and therefore causing almost no loss in accuracy).
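The submission does not spell out the reduction rule in the abstract, so the following is only a minimal illustrative sketch, not the authors' algorithm. It shows one elementary ReLU property such a reduction can exploit: positive homogeneity, ReLU(c·z) = c·ReLU(z) for c > 0, so two hidden neurons whose incoming weights (and biases) are positive multiples of each other compute proportional activations and can be merged without changing the network's output. The function name `merge_parallel_relu_neurons` and the single-hidden-layer setting are assumptions made for the example.

```python
import numpy as np

def merge_parallel_relu_neurons(W_in, b_in, W_out, tol=1e-6):
    """Illustrative sketch only (not the paper's algorithm).

    W_in:  (h, d) incoming weights of one hidden ReLU layer
    b_in:  (h,)   biases of that layer
    W_out: (o, h) outgoing weights
    Returns reduced (W_in, b_in, W_out) computing exactly the same map.
    """
    h = W_in.shape[0]
    params = np.concatenate([W_in, b_in[:, None]], axis=1)  # (h, d+1)
    merged = np.zeros(h, dtype=bool)
    keep, out_cols = [], []
    for i in range(h):
        if merged[i]:
            continue
        col = W_out[:, i].copy()
        norm_i = np.linalg.norm(params[i])
        for j in range(i + 1, h):
            if merged[j] or norm_i < tol:
                continue
            # Is neuron j a positive multiple c of neuron i?
            c = params[j] @ params[i] / norm_i**2
            if c > 0 and np.linalg.norm(params[j] - c * params[i]) < tol:
                # ReLU(c*(w_i.x + b_i)) = c*ReLU(w_i.x + b_i): fold j into i
                col += c * W_out[:, j]
                merged[j] = True
        keep.append(i)
        out_cols.append(col)
    return W_in[keep], b_in[keep], np.stack(out_cols, axis=1)
```

Merging proportional neurons in this way is exact everywhere, not only on the training data; the paper's reduction presumably goes further (hence the "almost no change within the training data" qualification), but those details are in the reviewed PDF.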
One-sentence Summary: We present an algorithm that reduces the number of neurons in a Deep ReLU Network and offers several important benefits.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Reviewed Version (pdf): https://openreview.net/references/pdf?id=fOMiZl0bJd