Non-Uniform Adversarially Robust Pruning

25 Feb 2022, 12:35 (modified: 16 Jul 2022, 13:35) · AutoML-Conf 2022 (Main Track) · Readers: Everyone
Abstract: Neural networks are often highly redundant and can thus be effectively compressed to a fraction of their initial size using model-pruning techniques without harming overall prediction accuracy. In addition, pruned networks need to maintain robustness against attacks such as adversarial examples. Recent research on combining these objectives has shown significant advances using uniform compression strategies, that is, all weights or channels are compressed equally according to a preset compression ratio. In this paper, we show that employing non-uniform compression strategies significantly improves both clean-data accuracy and adversarial robustness under high overall compression. We leverage reinforcement learning to find an optimal trade-off and demonstrate that the resulting compression strategy can be used as a plug-in replacement for the uniform compression ratios of existing state-of-the-art approaches.
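The core distinction the abstract draws — a single preset ratio for every layer versus learned per-layer ratios — can be illustrated with a minimal magnitude-pruning sketch. This is not the paper's implementation: the per-layer ratios below are hand-picked purely for illustration, whereas the paper searches for them with reinforcement learning under an adversarial-robustness objective.

```python
import random

def prune_by_magnitude(weights, ratio):
    """Zero out the fraction `ratio` of weights with the smallest magnitude."""
    k = int(ratio * len(weights))
    if k == 0:
        return list(weights)
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]

random.seed(0)
# Three toy "layers" of 256 weights each, standing in for a real network.
layers = [[random.gauss(0, 1) for _ in range(256)] for _ in range(3)]

# Uniform strategy: every layer is pruned at the same preset ratio.
uniform = [prune_by_magnitude(layer, 0.5) for layer in layers]

# Non-uniform strategy: per-layer ratios (hand-picked here; the paper
# learns them via RL) that meet roughly the same overall 50% budget.
ratios = [0.3, 0.5, 0.7]
non_uniform = [prune_by_magnitude(layer, r) for layer, r in zip(layers, ratios)]

def sparsity(pruned_layers):
    """Overall fraction of zeroed weights across all layers."""
    total = sum(len(layer) for layer in pruned_layers)
    zeros = sum(w == 0.0 for layer in pruned_layers for w in layer)
    return zeros / total
```

Both strategies hit (approximately) the same overall compression budget; the difference is only in how the budget is distributed across layers, which is exactly the degree of freedom the RL search exploits.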
Keywords: adversarial robustness, model pruning, Auto-ML
One-sentence Summary: RL-based non-uniform strategy search for adversarially robust pruning
Track: Main track
Reproducibility Checklist: Yes
Broader Impact Statement: Yes
Paper Availability And License: Yes
Code Of Conduct: Yes
Reviewers: Qi Zhao
Main Paper And Supplementary Material: pdf
Code And Dataset Supplement: zip
Steps For Environmental Footprint Reduction During Development: In particular, we use homogeneous class-wise sampling for the validation dataset, so that it conforms to the overall data distribution while reducing consumption in the evaluation step of RL exploration. Additionally, we fully utilize each GPU card by running multiple processes in parallel, which reduces the average time spent per experiment. We conducted all our experiments on Nvidia RTX-3090 GPU cards, and consumption remains reasonable as our university uses 100% renewable energy. For an objective measurement, we estimate total emissions of 204.96 kgCO2eq had the experiments been run on the commodity Google Cloud Platform in region europe-west3.
CPU Hours: 0
GPU Hours: 960
TPU Hours: 0
Evaluation Metrics: No
Estimated CO2e Footprint: 204.96
Class Of Approaches: Reinforcement Learning, Adversarial Training, Adversarial Examples
Datasets And Benchmarks: CIFAR-10, ImageNet, SVHN
Performance Metrics: Accuracy, Balanced Accuracy