Data-free pruning of CNN using kernel similarity

Published: 01 Jan 2025, Last Modified: 18 Apr 2025 · Multimedia Systems 2025 · CC BY-SA 4.0
Abstract: Channel pruning can effectively compress Convolutional Neural Networks (CNNs) for deployment on edge devices. Most existing pruning methods are data-driven, relying heavily on datasets and requiring the pruned models to be fine-tuned for several epochs. However, data privacy protection can make datasets difficult or impossible to obtain, rendering current pruning methods infeasible in such scenarios. To solve this issue, we propose a fine-grained data-free CNN pruning method that consists of filter reconstruction and feature reconstruction. To reduce the number of kernels in each filter, we group the kernels of each filter by their similarity and compute a representative kernel for each group, which is used to reconstruct the filter. During inference, we perform feature reconstruction to match the input channels of the reconstructed filters, satisfying the operational requirements of convolutional layers. We validate the effectiveness of our method through extensive experiments with ResNet, MobileNet, and VGG on the CIFAR and ImageNet datasets. For MobileNet-V2 on ImageNet, we obtain a 53.2% FLOPs reduction with only a 0.64% drop in Top-1 accuracy, without any fine-tuning.
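The core idea of filter reconstruction can be illustrated with a small sketch. The snippet below is an assumption-laden illustration, not the paper's actual algorithm: it greedily groups the 2-D kernels of a single filter by cosine similarity (the paper's similarity measure and grouping procedure may differ) and takes each group's mean as the representative kernel. The `threshold` parameter is hypothetical.

```python
import numpy as np

def group_kernels_by_similarity(filt, threshold=0.8):
    """Greedily group the 2-D kernels of one filter by cosine similarity.

    filt: array of shape (C_in, k, k) -- one convolutional filter.
    Returns a list of index groups. The threshold and the greedy
    grouping strategy are illustrative assumptions.
    """
    flat = filt.reshape(filt.shape[0], -1)
    norms = np.linalg.norm(flat, axis=1, keepdims=True)
    unit = flat / np.clip(norms, 1e-12, None)  # unit-normalized kernels
    groups = []
    assigned = np.zeros(filt.shape[0], dtype=bool)
    for i in range(filt.shape[0]):
        if assigned[i]:
            continue
        sims = unit @ unit[i]  # cosine similarity to kernel i
        members = np.where((sims >= threshold) & ~assigned)[0]
        assigned[members] = True
        groups.append(members.tolist())
    return groups

def reconstruct_filter(filt, groups):
    """Replace each group by its mean kernel, giving a smaller filter
    with one representative kernel per group."""
    return np.stack([filt[g].mean(axis=0) for g in groups])
```

At inference time, the matching feature-reconstruction step would then aggregate the input feature maps belonging to each group so that the reduced filter still sees one input channel per representative kernel.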