An Efficient Structural Pruning for Spiking Neural Networks by Balancing Accuracy and Sparsification
Keywords: spiking neural networks
Abstract: The increasing scale of spiking neural networks (SNNs) poses significant challenges for deployment on resource-constrained neuromorphic hardware, necessitating lightweight and learnable structural solutions. Interestingly, biological neural systems employ an efficient organizational strategy: hierarchical structural reorganization around functional clusters, in which new connections grow orthogonally to existing ones to expand representational capacity. Inspired by this mechanism, we propose a dynamic pruning and regrowth framework with channel-level orthogonality for SNNs (DPRC-SNNs) to enable scalable and efficient structural learning. DPRC-SNNs introduce a spiking column subset selection mechanism that integrates channel-level pruning with orthogonality-driven regrowth, selectively restoring diverse and complementary channels to minimize the information loss caused by aggressive pruning. By iteratively pruning redundant channels and regrowing orthogonal ones, DPRC-SNNs preserve functional diversity while increasing sparsity at the channel level. Extensive evaluations on CIFAR10, DVS-Gesture, and DVS-CIFAR10 demonstrate that DPRC-SNNs achieve high compression rates and computational efficiency without compromising accuracy, showing strong potential for neuromorphic deployment.
Primary Area: applications to neuroscience & cognitive science
Submission Number: 8634