SparseProp: Efficient Event-Based Simulation and Training of Sparse Recurrent Spiking Neural Networks

Published: 21 Sept 2023, Last Modified: 17 Jan 2024, NeurIPS 2023 poster
Keywords: spiking networks, event-based simulation, sparse networks, backpropagation, algorithm, neuroscience
TL;DR: We propose a novel event-based algorithm for simulating and training spiking neural networks, reducing the computational cost per network spike from O(N) to O(log N) for sparse spiking networks.
Abstract: Spiking Neural Networks (SNNs) are biologically inspired models capable of processing information in streams of action potentials. However, simulating and training SNNs is computationally expensive due to the need to solve large systems of coupled differential equations. In this paper, we propose a novel event-based algorithm called SparseProp for simulating and training sparse SNNs. Our algorithm reduces the computational cost of both the forward and the backward pass from O(N) to O(log N) per network spike, enabling numerically exact simulations of large spiking networks and their efficient training via backpropagation through time. By exploiting the sparsity of the network, SparseProp avoids iterating through all neurons at every spike and instead uses efficient state updates. We demonstrate the effectiveness of SparseProp on several classical integrate-and-fire neuron models, including a simulation of a sparse SNN with one million LIF neurons that runs more than four orders of magnitude faster than previous implementations. Our work provides an efficient and exact solution for training large-scale spiking neural networks and opens up new possibilities for building more sophisticated brain-inspired models.
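To make the complexity claim concrete, below is a minimal sketch of one way an event-based, sparsity-aware LIF simulation can reach O(K log N) work per network spike (K = synapses per neuron): membrane potentials are integrated analytically and lazily between events, and a binary heap tracks predicted threshold-crossing times. All names, parameters, and implementation details here (e.g. `lif_event_sim`, the version-counter trick for stale heap entries) are illustrative assumptions, not the paper's actual code.

```python
import heapq
import numpy as np

def lif_event_sim(W, v0, T, tau=1.0, I=1.5, v_th=1.0, v_reset=0.0):
    """Event-based simulation of a pulse-coupled LIF network (sketch).

    W maps each neuron index to a list of (target, weight) pairs, so a
    spike touches only its K postsynaptic targets; the next network
    spike is found with a binary heap in O(log N).
    """
    N = len(v0)
    v = np.array(v0, dtype=float)      # potential at time t_last[i]
    t_last = np.zeros(N)               # time neuron i was last updated
    version = np.zeros(N, dtype=int)   # invalidates stale heap entries

    def crossing_time(i, t):
        # Integrate dv/dt = (I - v)/tau analytically from t_last[i] to t,
        # then solve for the threshold-crossing time (inf if unreachable).
        vi = I + (v[i] - I) * np.exp(-(t - t_last[i]) / tau)
        if vi >= v_th:
            return t
        if I <= v_th:
            return np.inf
        return t + tau * np.log((I - vi) / (I - v_th))

    heap = [(crossing_time(i, 0.0), 0, i) for i in range(N)]
    heapq.heapify(heap)

    spikes = []
    while heap:
        t_sp, ver, j = heapq.heappop(heap)
        if t_sp > T:                   # heap minimum beyond horizon: done
            break
        if ver != version[j]:          # stale prediction, skip
            continue
        spikes.append((t_sp, j))
        # Reset the spiking neuron; update only its postsynaptic targets.
        v[j], t_last[j] = v_reset, t_sp
        version[j] += 1
        heapq.heappush(heap, (crossing_time(j, t_sp), version[j], j))
        for k, w in W.get(j, []):
            v[k] = I + (v[k] - I) * np.exp(-(t_sp - t_last[k]) / tau) + w
            t_last[k] = t_sp
            version[k] += 1
            heapq.heappush(heap, (crossing_time(k, t_sp), version[k], k))
    return spikes

# Tiny usage example with hypothetical parameters: 100 neurons,
# 10 inhibitory synapses each, simulated for 10 time units.
rng = np.random.default_rng(0)
W = {i: [(int(t), -0.1) for t in rng.choice(100, size=10, replace=False)]
     for i in range(100)}
spikes = lif_event_sim(W, v0=rng.uniform(0, 1, 100), T=10.0)
```

The heap pop/push supplies the log N factor, and the per-neuron lazy timestamps replace the O(N) sweep over all membrane potentials that a time-stepped or naive event-based scheme would perform at every spike.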
Supplementary Material: pdf
Submission Number: 1464