Model-Free Energy Distance for Pruning DNNs

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Withdrawn Submission · Readers: Everyone
Keywords: Pruning, Residual Networks, Structured Method, Energy Distance
Abstract: We propose a novel method for compressing Deep Neural Networks (DNNs) with performance competitive with state-of-the-art methods. We introduce a new model-free information measure between the feature maps and the output of the network. Because the measure is model-free, no parametric assumptions on the feature distribution are required. This measure is then used to prune collections of redundant layers in networks with skip-connections. Numerical experiments on the CIFAR-10/100, SVHN, Tiny ImageNet, and ImageNet data sets show the efficacy of the proposed approach in compressing deep models. For instance, on CIFAR-10 classification our method reduces the number of parameters by 64.50% and the FLOPs by 60.31% for a full DenseNet model with 0.77 million parameters, with only a 1% drop in test accuracy. Our code is available at https://github.com/suuyawu/PEDmodelcompression
One-sentence Summary: We measure the importance of sets of layers in DNNs with skip-connections using a model-free distance and use it to prune the redundant layers.
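A minimal sketch of the statistic named in the title, not the authors' implementation: the empirical energy distance D(X, Y) = 2 E||X − Y|| − E||X − X'|| − E||Y − Y'|| between two samples. How the paper maps feature maps and network outputs into a common space is left to the paper itself; the usage below, comparing the network's outputs with and without a candidate residual block, is only an assumed illustration.

```python
import torch

def energy_distance(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """Empirical energy distance between samples x: (n, d) and y: (m, d)."""
    d_xy = torch.cdist(x, y).mean()   # E||X - Y||
    d_xx = torch.cdist(x, x).mean()   # E||X - X'||
    d_yy = torch.cdist(y, y).mean()   # E||Y - Y'||
    return 2.0 * d_xy - d_xx - d_yy

# Hypothetical usage: compare the output distribution with and without a
# candidate block; a small distance suggests the block is redundant.
if __name__ == "__main__":
    logits_full = torch.randn(256, 10)     # outputs with the block present
    logits_ablate = torch.randn(256, 10)   # outputs with the block skipped
    print(energy_distance(logits_full, logits_ablate).item())
```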
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Supplementary Material: zip
Reviewed Version (pdf): https://openreview.net/references/pdf?id=z5Dwef9ehz