Shapley Oracle Pruning for Convolutional Neural Networks

TMLR Paper402 Authors

01 Sept 2022 (modified: 17 Sept 2024) · Rejected by TMLR · CC BY 4.0
Abstract: Recent hardware and algorithmic developments have scaled convolutional neural networks to considerable sizes. Network performance then depends on the interplay of an ever larger pool of possibly correlated and redundant parameters, grouped into convolutional channels or residual blocks. To this end, we propose a game-theoretic approach based on the Shapley value, which computes the average contribution of a neuron while accounting for neuron synergies. A significant feature of the method is that it incorporates oracle pruning, the ideal configuration of a compressed network, to build a unique ranking of nodes that satisfies a range of normative criteria. The ranking enables selecting the top parameters in the network and removing the trailing ones, yielding a smaller and more interpretable model. As applying the Shapley value to numerous neurons is computationally challenging, we introduce three tractable approximations that handle large models and complete pruning in reasonable time. Experiments show that the proposed normative ranking and its approximations are practical, achieving state-of-the-art network compression. The code is available at https://anonymous.4open.science/r/shapley_oracle_pruning1/.
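To illustrate the idea of ranking neurons by their average marginal contribution, the following is a minimal sketch (not the authors' implementation) of Monte Carlo permutation sampling, a standard tractable approximation of the Shapley value. The characteristic function `toy_accuracy` is a hypothetical stand-in for evaluating a network with only a subset of channels retained; channels 0 and 1 are deliberately redundant, so the Shapley value splits their shared credit between them.

```python
import random

def shapley_monte_carlo(players, value_fn, num_samples=2000, seed=0):
    """Estimate Shapley values by sampling random player orderings.

    For each sampled permutation, a player's marginal contribution is
    value_fn(coalition | {player}) - value_fn(coalition), accumulated
    as players join in permutation order and averaged over samples.
    """
    rng = random.Random(seed)
    totals = {p: 0.0 for p in players}
    for _ in range(num_samples):
        perm = list(players)
        rng.shuffle(perm)
        coalition = frozenset()
        prev = value_fn(coalition)
        for p in perm:
            coalition = coalition | {p}
            cur = value_fn(coalition)
            totals[p] += cur - prev
            prev = cur
    return {p: t / num_samples for p, t in totals.items()}

def toy_accuracy(coalition):
    """Hypothetical pruned-network 'accuracy' for a set of kept channels.

    Channels 0 and 1 are redundant copies (either one yields the same gain);
    channel 2 contributes independently.
    """
    acc = 0.0
    if 0 in coalition or 1 in coalition:
        acc += 0.6
    if 2 in coalition:
        acc += 0.3
    return acc

shap = shapley_monte_carlo([0, 1, 2], toy_accuracy)
# The redundant pair 0 and 1 each receive about half of their shared 0.6 credit,
# while channel 2 receives its full independent 0.3; trailing channels in the
# resulting ranking would be pruned first.
ranking = sorted(shap, key=shap.get, reverse=True)
```

A magnitude-based criterion would credit both redundant channels fully; the Shapley value instead divides their shared contribution, which is the synergy-awareness the abstract refers to.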
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Yunhe_Wang1
Submission Number: 402