Pruning Neural Networks via Coresets and Convex Geometry: Towards No Assumptions

Published: 31 Oct 2022, Last Modified: 09 Oct 2022, NeurIPS 2022 Accept
Keywords: Coresets, Convex Geometry, Neural Network Pruning
TL;DR: A neural network pruning procedure that combines coresets and convex geometry. This method reduces the number of assumptions required by previous coreset-based pruning methods.
Abstract: Pruning is one of the predominant approaches for compressing deep neural networks (DNNs). Lately, coresets (provable data summarizations) were leveraged for pruning DNNs, adding the advantage of theoretical guarantees on the trade-off between the compression rate and the approximation error. However, coresets in this domain were either data-dependent or generated under restrictive assumptions on both the model's weights and inputs. In real-world scenarios, such assumptions are rarely satisfied, limiting the applicability of coresets. To address this, we suggest a novel and robust framework for computing such coresets under mild assumptions on the model's weights and without any assumption on the training data. The idea is to compute the importance of each neuron in each layer with respect to the output of the following layer. This is achieved by an elegant combination of the L\"{o}wner ellipsoid and Carath\'{e}odory's theorem. Our method is simultaneously data-independent, applicable to various networks and datasets (due to the simplified assumptions), and theoretically supported. Experimental results show that our method outperforms existing coreset-based neural pruning approaches across a wide range of networks and datasets. For example, our method achieved a $62\%$ compression rate on ResNet50 on ImageNet with a $1.09\%$ drop in accuracy.
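
A minimal sketch of the general idea of data-independent structured pruning described in the abstract: each neuron is scored by its outgoing weights in the following layer, and only the highest-scoring neurons are kept. The scoring here is a plain norm-based proxy, not the L\"{o}wner-ellipsoid/Carath\'{e}odory construction of the paper, and all names (neuron_importance, prune_layer, keep_ratio) are illustrative.

import numpy as np

def neuron_importance(W_next: np.ndarray) -> np.ndarray:
    # W_next has shape (out_features, in_features); column j holds the
    # outgoing weights of neuron j in the current layer. Score each neuron
    # by the norm of its outgoing weights (a simple data-independent proxy).
    return np.linalg.norm(W_next, axis=0)

def prune_layer(W_curr: np.ndarray, W_next: np.ndarray, keep_ratio: float):
    # Keep the highest-importance neurons: drop the corresponding rows of
    # the current layer and the matching columns of the next layer.
    scores = neuron_importance(W_next)
    k = max(1, int(keep_ratio * scores.size))
    keep = np.sort(np.argsort(scores)[-k:])  # indices of kept neurons
    return W_curr[keep, :], W_next[:, keep], keep

# Toy usage: a layer with 512 neurons feeding a layer with 256 outputs.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((512, 784))   # current layer: 784 -> 512
W2 = rng.standard_normal((256, 512))   # next layer:    512 -> 256
W1_small, W2_small, kept = prune_layer(W1, W2, keep_ratio=0.5)
print(W1_small.shape, W2_small.shape)  # (256, 784) (256, 256)

In the paper's setting, the norm-based score above would be replaced by the importance derived from the following layer's L\"{o}wner ellipsoid together with Carath\'{e}odory's theorem, which is what yields the stated theoretical guarantees.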
Supplementary Material: pdf
