Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Model Compression, Structured Pruning, Hashing, CNNs
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We propose a data-free, dynamic CNN compression method that requires no training or fine-tuning.
Abstract: To reduce the computational cost of convolutional neural networks (CNNs) for usage on resource-constrained devices, structured pruning approaches have shown promising results, drastically reducing floating-point operations (FLOPs) without substantial drops in accuracy.
However, most recent methods require fine-tuning or specific training procedures to achieve a reasonable trade-off between retained accuracy and FLOP reduction. This adds computational overhead and requires training data to be available.
To this end, we propose HASTE ($\textbf{Has}$hing for $\textbf{T}$ractable $\textbf{E}$fficiency), a parameter-free and data-free module that acts as a plug-and-play replacement for any regular convolution module. It instantly reduces the network’s test-time inference cost without requiring any training or fine-tuning.
Using locality-sensitive hashing (LSH) to detect redundancies in the channel dimension, we drastically compress latent feature maps while sacrificing little accuracy. Similar channels are aggregated to reduce the input and filter depth simultaneously, allowing for cheaper convolutions (sketched in code below the abstract).
We demonstrate our approach on the popular vision benchmarks CIFAR-10 and ImageNet.
In particular, by simply swapping the convolution modules of a ResNet34 for our HASTE module, we instantly drop 46.72\% of FLOPs on CIFAR-10 while losing only 1.25\% accuracy.
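The channel-aggregation idea can be illustrated with a minimal sketch, assuming a random-hyperplane (SimHash) family as the LSH scheme. This is an illustrative simplification, not the authors' implementation: the name `hashed_conv2d`, hashing over the whole batch, and hyperplanes drawn per call are assumptions made for brevity. The key observation is that convolution is linear in the input channels, so for channels that are approximately equal within a bucket $B$, $\sum_{c \in B} w_c * x_c \approx \bar{x}_B * \sum_{c \in B} w_c$: input channels can be averaged and the corresponding filter slices summed.

```python
# Minimal sketch of LSH-based channel aggregation for one convolution.
# Assumptions (not from the paper): SimHash with random hyperplanes drawn
# per call, batch-wide hashing, and the helper name `hashed_conv2d`.
import torch
import torch.nn.functional as F

def hashed_conv2d(x, weight, num_hyperplanes=8):
    """x: (N, C, H, W) input, weight: (C_out, C, k, k) filters."""
    N, C, H, W = x.shape
    # Hash each input channel: project its flattened activations onto
    # random hyperplanes and keep the sign pattern as a binary code.
    flat = x.permute(1, 0, 2, 3).reshape(C, -1)               # (C, N*H*W)
    planes = torch.randn(num_hyperplanes, flat.shape[1], device=x.device)
    codes = (flat @ planes.T > 0).long()                      # (C, L) sign bits
    powers = 2 ** torch.arange(num_hyperplanes, device=x.device)
    keys = (codes * powers).sum(dim=1)                        # integer key per channel
    _, buckets = keys.unique(return_inverse=True)             # bucket id per channel
    B = int(buckets.max()) + 1
    # Merge: average input channels per bucket, sum filter slices per bucket.
    # Linearity gives sum_c w_c * x_c ≈ (mean_{c in B} x_c) * (sum_{c in B} w_c).
    x_sum = x.new_zeros(N, B, H, W).index_add_(1, buckets, x)
    counts = torch.bincount(buckets, minlength=B).view(1, B, 1, 1)
    x_red = x_sum / counts.to(x.dtype)
    w_red = weight.new_zeros(weight.shape[0], B, *weight.shape[2:])
    w_red.index_add_(1, buckets, weight)
    # Cheaper convolution: channel depth reduced from C to B buckets.
    return F.conv2d(x_red, w_red, padding=weight.shape[-1] // 2)
```

In a real plug-and-play module one would presumably keep the hyperplanes as fixed buffers and hash per input, so that the module stays a training-free, drop-in replacement for `nn.Conv2d` as the abstract describes.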
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3491