Temporal Misinformation and Conversion through Probabilistic Spiking Neurons

27 Sept 2024 (modified: 05 Feb 2025) · Submitted to ICLR 2025 · CC BY 4.0
Keywords: spiking neural networks, probabilistic spiking, ANN-SNN conversion
TL;DR: A novel method for ANN-SNN conversion through probabilistic spiking neurons
Abstract: In the age of large neural network models and their high energy demands, Spiking Neural Networks (SNNs) offer a compelling alternative to Artificial Neural Networks (ANNs) due to their energy efficiency and resemblance to biological brains. However, directly training SNNs with spatio-temporal backpropagation remains challenging because of their discrete signal processing and temporal dynamics. Alternative methods, notably ANN-SNN conversion, have enabled SNNs to achieve performance comparable to ANNs on various machine learning tasks, but often at the expense of the long latency required to reach that performance, especially on large-scale, complex datasets. The present work addresses the ANN-SNN conversion setting and identifies a new phenomenon we term ``temporal misinformation'', whereby randomly rearranging spikes through time in the converted SNN model improves its performance. To account for this, we propose bio-plausible, two-phase probabilistic (TPP) spiking neurons for use in ANN-SNN conversion. We demonstrate the benefits of the proposed method both theoretically and empirically through extensive experiments on CIFAR-10/100 and the large-scale ImageNet dataset over a variety of architectures, reaching state-of-the-art performance. Code is available on GitHub.
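The "temporal misinformation" probe described in the abstract — randomly rearranging spikes through time while leaving each neuron's firing rate untouched — can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the function name `shuffle_spike_times` and the `(T, N)` binary spike-train layout are assumptions for the sake of the example.

```python
import numpy as np

def shuffle_spike_times(spikes, seed=None):
    """Randomly permute each neuron's spike train along the time axis.

    `spikes` is a binary array of shape (T, N): T timesteps, N neurons.
    The permutation preserves every neuron's total spike count (its rate),
    so a rate-coded readout is unchanged while temporal structure is
    destroyed -- a simple probe for timing-dependent conversion effects.
    (Hypothetical sketch; layout and names are assumptions.)
    """
    rng = np.random.default_rng(seed)
    shuffled = spikes.copy()
    for n in range(spikes.shape[1]):
        # Independently permute the spike times of neuron n.
        shuffled[:, n] = rng.permutation(spikes[:, n])
    return shuffled
```

If accuracy of a converted SNN improves after such a rate-preserving shuffle, the temporal placement of spikes — not their count — is what was hurting the conversion, which is the observation the TPP neuron is designed to exploit.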
Primary Area: learning theory
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 11224
