Can Pruning Improve Certified Robustness of Neural Networks?

Published: 05 Apr 2023, Last Modified: 05 Apr 2023. Accepted by TMLR.
Abstract: With the rapid development of deep learning, the sizes of deep neural networks are growing beyond what hardware platforms can afford. Given that neural networks are often over-parameterized, one effective way to reduce this computational overhead is neural network pruning, which removes redundant parameters from trained networks. It has recently been observed that pruning can not only reduce computational overhead but also improve the empirical robustness of deep neural networks (NNs), potentially owing to the removal of spurious correlations while preserving predictive accuracy. This paper demonstrates, for the first time, that pruning can generally improve $L_\infty$ certified robustness for ReLU-based NNs under the \textit{complete verification} setting. Using the popular Branch-and-Bound (BaB) framework, we find that pruning can tighten the estimated bounds in certified robustness verification by alleviating the linear relaxation and sub-domain split problems. We empirically verify our findings with off-the-shelf pruning methods and further present a new stability-based pruning method, tailored for reducing neuron instability, that outperforms existing pruning methods in enhancing certified robustness. Our experiments show that by appropriately pruning an NN, its certified accuracy can be boosted by up to \textbf{8.2\%} under standard training and by up to \textbf{24.5\%} under adversarial training on the CIFAR10 dataset. We additionally observe the possible existence of \textit{certified lottery tickets} that match both the standard and certified robust accuracies of the original dense models across different datasets. Our findings offer a new angle on the intriguing interaction between sparsity and robustness, i.e., interpreting this interaction via neuron stability. Our code will be fully released.
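To make the neuron-stability idea concrete, below is a minimal sketch (a reader's illustration under stated assumptions, not the authors' released implementation): it scores hidden ReLU neurons by how often their pre-activation interval straddles zero under interval bound propagation within an $L_\infty$ ball, then zeroes out the most unstable neurons. The architecture, the `eps` value, the 10% pruning ratio, and all function names here are hypothetical choices for illustration.

```python
# Hypothetical sketch of stability-based pruning: a ReLU neuron is
# "unstable" if its pre-activation interval crosses zero, which forces
# linear relaxation (and possible splitting) in complete verifiers.
import torch
import torch.nn as nn

def interval_bounds(linear, lb, ub):
    """Propagate interval bounds [lb, ub] through a linear layer (IBP)."""
    W, b = linear.weight, linear.bias
    mid, rad = (lb + ub) / 2, (ub - lb) / 2
    center = mid @ W.T + b
    radius = rad @ W.abs().T
    return center - radius, center + radius

def instability_scores(model, x, eps):
    """For each hidden ReLU neuron, the fraction of inputs for which its
    pre-activation interval straddles zero under an L_inf ball of radius eps."""
    lb, ub = x - eps, x + eps
    scores = []
    linears = [m for m in model if isinstance(m, nn.Linear)]
    for linear in linears[:-1]:                      # hidden layers only
        lb, ub = interval_bounds(linear, lb, ub)
        scores.append(((lb < 0) & (ub > 0)).float().mean(0))
        lb, ub = lb.clamp(min=0), ub.clamp(min=0)    # ReLU on bounds
    return scores

# Usage (illustrative): prune the 10% most unstable neurons of the
# first hidden layer by zeroing their incoming weights and bias.
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(),
                      nn.Linear(256, 256), nn.ReLU(),
                      nn.Linear(256, 10))
x = torch.rand(128, 784)
scores = instability_scores(model, x, eps=2 / 255)
k = int(0.1 * scores[0].numel())
unstable = scores[0].topk(k).indices
with torch.no_grad():
    model[0].weight[unstable] = 0.0
    model[0].bias[unstable] = 0.0
```

Zeroing a neuron's incoming weights pins its pre-activation to a constant, so the corresponding ReLU no longer needs relaxation during verification; the paper's actual criterion and pruning schedule may differ.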
Submission Length: Long submission (more than 12 pages of main content)
Changes Since Last Submission: Revised according to the reviewers' feedback.
Code: https://github.com/VITA-Group/CertifiedPruning
Assigned Action Editor: ~Jasper_Snoek1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 606