OneSpike: Ultra-low latency spiking neural networks

21 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Spiking Neural Network; Energy efficiency; Model Scaling
Abstract: With the development of deep learning models, there has been growing research interest in spiking neural networks (SNNs) due to the energy efficiency that results from their multiplier-less nature. Existing methodologies for SNN development either convert artificial neural networks (ANNs) into equivalent SNNs or emulate ANNs directly, but two crucial challenges remain. The first is preserving the accuracy of the original ANN model during conversion to an SNN. The second is running complex SNNs at low latency. To address the problem of high latency while maintaining high accuracy, we propose a parallel spike-generation (PSG) method that generates all spikes in a single timestep while achieving better model performance than the standard Integrate-and-Fire model. Building on PSG, we propose OneSpike, a framework that converts any rate-encoded convolutional SNN into one that uses only a single timestep without accuracy loss. Our OneSpike model achieves a state-of-the-art (for SNNs) accuracy of $81.92\%$ on the ImageNet dataset using just a single timestep. To the best of our knowledge, this study is the first to explore converting multi-timestep SNNs into equivalent single-timestep ones while maintaining accuracy. These results highlight the potential of our approach in addressing key challenges in SNN research, paving the way for more efficient and accurate SNNs in practical applications.
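To make the latency contrast concrete, the sketch below compares a standard Integrate-and-Fire (IF) neuron simulated over `T` timesteps with a one-shot, closed-form count of the spikes it would emit. This is an illustrative simplification, not the paper's actual PSG method (which is not detailed in this abstract): the function names, the soft-reset IF variant, and the clamped closed-form formula are all assumptions made for the sketch; they show only why a rate code accumulated over `T` steps can, for constant input, be reproduced without a temporal loop.

```python
import numpy as np

def if_neuron_rate(x, threshold=1.0, T=8):
    """Standard soft-reset Integrate-and-Fire neuron simulated over T
    timesteps: integrate a constant input each step, emit a spike and
    subtract the threshold whenever the membrane potential crosses it.
    Returns the total spike count per neuron."""
    v = np.zeros_like(x)        # membrane potential
    spikes = np.zeros_like(x)   # accumulated spike count
    for _ in range(T):
        v = v + x                          # integrate input
        fired = v >= threshold
        spikes = spikes + fired
        v = np.where(fired, v - threshold, v)  # soft reset
    return spikes

def parallel_spike_count(x, threshold=1.0, T=8):
    """One-shot approximation (hypothetical stand-in for PSG): compute
    the number of spikes the soft-reset IF neuron above would emit over
    T steps directly, with no loop over time. The count is clamped to
    [0, T] because the neuron can fire at most once per step."""
    return np.clip(np.floor(T * x / threshold), 0, T)

x = np.array([0.0, 0.3, 0.7, 1.2])
print(if_neuron_rate(x))        # sequential simulation over T steps
print(parallel_spike_count(x))  # closed-form count in one step
```

For constant positive inputs the two agree exactly, which is the intuition behind collapsing `T` timesteps into one; a real conversion framework must additionally handle layer-to-layer spike propagation and non-constant inputs.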
Supplementary Material: pdf
Primary Area: general machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3309