Feature Overlapping: The Computational Redundancy Caused by Repeated Features Across Different Time Steps in SNNs
Keywords: Spiking Neural Network; Transformer; Feature Analysis; Image Classification
Abstract: Spiking neural networks (SNNs) offer a promising path toward large-scale, energy-efficient networks. However, the high training cost incurred by multiple time steps currently limits their application. To address this, we depart from the traditional approach of reducing the number of time steps and instead investigate feature redundancy between time steps. By jointly unfolding the computation of SNNs across the temporal and spatial dimensions, we are the first to identify the Feature Overlapping Phenomenon, providing new insights for improving SNN training paradigms. Our Temporal Differential Decoupling (TDD) method separates dynamic from static features, reducing redundant computation. By transforming the feature space into the differential domain, it overcomes the original computational domain's inability to effectively filter sensitive information. In the differential domain, we propose the Gradient Sensitivity Criterion (GSC), which further reduces training costs while avoiding the loss of important feature information. We also introduce the Differential Domain Low-Sparsity Approximation (DDLA) algorithm, which significantly reduces computational resource consumption while maintaining accuracy by adjusting the filtering ratio. Experiments show a reduction of up to 80.9\% in spikes per time step and up to 57.8\% in total spike count, significantly lowering the inference cost of SNNs.
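The core idea of decoupling static from dynamic features in the differential domain can be illustrated with a minimal NumPy sketch. This is a hypothetical toy implementation based only on the abstract's description, not the authors' actual TDD/DDLA code: it treats the first time step as the static component, takes step-to-step differences as the dynamic component, keeps only the largest-magnitude differences (a stand-in for the sensitivity criterion, controlled by a `keep_ratio` filtering parameter), and reconstructs an approximation of the original features.

```python
import numpy as np

def differential_decoupling(features, keep_ratio=0.5):
    """Toy sketch of differential-domain filtering (hypothetical, not the paper's code).

    features:   array of shape (T, N) -- activations at T time steps.
    keep_ratio: fraction of differential entries kept per step (the filtering ratio).
    """
    T, N = features.shape
    base = features[0]                 # "static" component: first time step
    diffs = np.diff(features, axis=0)  # "dynamic" component: step-to-step changes
    k = max(1, int(keep_ratio * N))    # number of entries to keep per step
    filtered = np.zeros_like(diffs)
    for t in range(T - 1):
        # Keep the k largest-magnitude differences; zero out the rest.
        idx = np.argsort(np.abs(diffs[t]))[-k:]
        filtered[t, idx] = diffs[t, idx]
    # Reconstruct an approximation by accumulating the sparse differences.
    approx = np.concatenate([base[None], base + np.cumsum(filtered, axis=0)])
    return approx, filtered

# Usage: features that change slowly across time steps are highly redundant,
# so most differential entries are small and can be dropped with little error.
rng = np.random.default_rng(0)
x = 1.0 + np.cumsum(0.1 * rng.normal(size=(4, 8)), axis=0)
approx, filt = differential_decoupling(x, keep_ratio=0.5)
print(approx.shape)  # (4, 8)
```

Under this sketch, computation at each subsequent time step touches only the retained sparse differences rather than the full feature map, which is the intuition behind the reported reduction in per-time-step spike counts.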
Primary Area: applications to neuroscience & cognitive science
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9600