CoPruning: Exploring the Parameter-Gradient Nonlinear Correlation for Neural Network Pruning Using Copula Function
Keywords: Neural Network Pruning; Copula Function; Copula Entropy; Sparse Models; Joint Distribution Model
TL;DR: Exploiting the Relationship Between Parameters and Gradients via Copula Function for Neural Network Pruning
Abstract: The sheer size of modern neural networks necessitates pruning techniques to overcome the significant computational challenges posed by model serving.
However, existing pruning techniques fail to capture the nonlinear correlation between parameters and gradients, which is crucial to the pruning process, leading to low accuracy under high sparsity.
In this work, we propose CoPruning, a new pruning framework that uses a copula-function-based joint distribution model to precisely capture the intricate nonlinear correlation between parameters and gradients, enabling more insightful pruning decisions.
Additionally, we integrate a local optimization approach within CoPruning to better capture relative changes in parameters within their local context, providing new metrics for finer-grained optimization.
Extensive experiments on various networks demonstrate CoPruning's strong performance against state-of-the-art (SoTA) pruning algorithms.
CoPruning outperforms the SoTA by 3.09%, 1.87%, and 2.19% in accuracy on MLPNet, ResNet20, and ResNet50 at 0.98 sparsity, respectively, and by 10.43% on MobileNetV1 at 0.9 sparsity on ImageNet.
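The abstract does not specify CoPruning's exact copula model, but the core idea of measuring nonlinear parameter-gradient dependence via copula entropy can be sketched. The snippet below is a minimal illustration, not the authors' implementation: it assumes a Gaussian copula, rank-transforms weights and gradients to uniform margins, maps them to normal scores, and computes the copula entropy (more negative means stronger dependence, even when the relationship is nonlinear). The function name and the use of this score for pruning are assumptions.

```python
import numpy as np
from scipy.stats import norm, rankdata

def gaussian_copula_entropy(w, g):
    """Illustrative copula-entropy estimate of the dependence between
    a weight tensor `w` and its gradient `g` (both flattened 1-D).

    Steps (Gaussian-copula assumption, NOT the paper's exact model):
      1. rank-transform each variable to uniform margins in (0, 1);
      2. map to normal scores via the inverse Gaussian CDF;
      3. copula entropy H_c = 0.5 * log(1 - rho^2), where rho is the
         correlation of the normal scores.
    H_c <= 0; values near 0 mean near-independence, strongly negative
    values mean strong (possibly nonlinear) dependence.
    """
    n = len(w)
    u = rankdata(w) / (n + 1)          # empirical CDF -> uniform margins
    v = rankdata(g) / (n + 1)
    zu, zv = norm.ppf(u), norm.ppf(v)  # normal scores
    rho = np.corrcoef(zu, zv)[0, 1]
    return 0.5 * np.log(max(1.0 - rho**2, 1e-12))

# Toy check: a monotone-nonlinear gradient relation vs. an independent one.
rng = np.random.default_rng(0)
w = rng.normal(size=1000)
g_dep = np.tanh(3 * w) + 0.1 * rng.normal(size=1000)  # nonlinear, dependent
g_ind = rng.normal(size=1000)                          # independent

ent_dep = gaussian_copula_entropy(w, g_dep)  # strongly negative
ent_ind = gaussian_copula_entropy(w, g_ind)  # near zero
print(ent_dep, ent_ind)
```

In a pruning context, such a score could rank parameters by how strongly their values co-vary with their gradients, which a plain Pearson correlation on the raw values would miss for nonlinear relationships.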
Supplementary Material: zip
Primary Area: optimization
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3720