Enabling Probabilistic Inference on Large-Scale Spiking Neural Networks

22 Sept 2022 (modified: 13 Feb 2023) · ICLR 2023 Conference Withdrawn Submission · Readers: Everyone
Keywords: spiking neural networks, SNNs
Abstract: Deep spiking neural networks (SNNs) have achieved success in many machine learning tasks. However, most existing works consider deterministic SNNs, ignoring the inherent randomness of neurons. Meanwhile, existing work on stochastic SNNs is limited to small networks and hard to scale to larger SNN topologies. We introduce Noisy SNNs (NSNNs), built upon a stochastic noisy LIF neuron model, to enable probabilistic inference on large-scale SNN topologies. By viewing an NSNN as a Bayesian network, we derive a three-factor learning rule called noise-driven learning (NDL) for synaptic optimization. The post-synaptic factor in NDL is obtained from the statistics of the neuronal membrane noise, avoiding the problematic derivative of the Heaviside spiking function and explaining surrogate gradients from the standpoint of random noise. NDL is backpropagation-compatible, enabling NSNNs to be extended to any SNN topology through modular replacement (code is available at https://cutt.ly/9CxT5jI). Evaluations on CIFAR-10/100 and DVS-CIFAR show that NSNNs achieve competitive performance in clean test scenarios. Furthermore, NSNNs exhibit high robustness against challenging perturbations such as adversarial perturbation and spike-level disturbance.
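The abstract's core mechanism (a noisy LIF neuron whose firing is stochastic, with a gradient taken through the noise statistics rather than the non-differentiable Heaviside spike) can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the Gaussian noise model, threshold, and noise scale `SIGMA` are assumptions for the example.

```python
import math
import random

THRESHOLD = 1.0  # firing threshold (illustrative value)
SIGMA = 0.3      # std of the assumed Gaussian membrane noise

def gaussian_cdf(x, sigma):
    """CDF of a zero-mean Gaussian with standard deviation sigma."""
    return 0.5 * (1.0 + math.erf(x / (sigma * math.sqrt(2.0))))

def gaussian_pdf(x, sigma):
    """PDF of a zero-mean Gaussian with standard deviation sigma."""
    return math.exp(-x * x / (2.0 * sigma * sigma)) / (sigma * math.sqrt(2.0 * math.pi))

def noisy_lif_spike(u, rng=random):
    """Stochastic firing: spike if membrane potential plus noise crosses threshold."""
    return 1 if u + rng.gauss(0.0, SIGMA) >= THRESHOLD else 0

def firing_probability(u):
    """P(spike) = P(noise >= THRESHOLD - u) = 1 - CDF(THRESHOLD - u)."""
    return 1.0 - gaussian_cdf(THRESHOLD - u, SIGMA)

def postsynaptic_factor(u):
    """d P(spike) / d u: the noise PDF evaluated at (THRESHOLD - u).

    This smooth, bell-shaped function replaces the ill-defined Heaviside
    derivative, which is how the noise view can explain surrogate gradients.
    """
    return gaussian_pdf(THRESHOLD - u, SIGMA)
```

Note that `postsynaptic_factor` peaks when the membrane potential sits exactly at threshold and decays away from it, matching the shape of commonly used surrogate gradient functions.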
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Neuroscience and Cognitive Science (e.g., neural coding, brain-computer interfaces)