Exploring the Limitations of Layer Synchronization in Spiking Neural Networks

Published: 03 Sept 2025, Last Modified: 03 Sept 2025. Accepted by TMLR. License: CC BY 4.0
Abstract: Neural-network processing in machine learning applications relies on layer synchronization. This is practiced even in artificial Spiking Neural Networks (SNNs), which are touted as consistent with neurobiology, despite the fact that processing in the brain is asynchronous. A truly asynchronous system would instead allow all neurons to concurrently evaluate their threshold and emit spikes upon receiving any presynaptic current. Omitting layer synchronization is potentially beneficial for latency and energy efficiency, but asynchronous execution of models previously trained with layer synchronization may entail a mismatch in network dynamics and performance. We present and quantify this problem, and show that models trained with layer synchronization either perform poorly in its absence, or fail to benefit from any energy and latency reduction when such a mechanism is in place. We then explore a potential solution direction, based on a generalization of backpropagation-based training that integrates knowledge about an asynchronous execution scheduling strategy, in order to learn models suitable for asynchronous processing. We experiment with two asynchronous neuron execution scheduling strategies on datasets that encode spatial and temporal information, and we show the potential of asynchronous processing to use fewer spikes (up to 50\%), complete inference faster (up to 2x), and achieve competitive or even better accuracy (up to $\sim$10\% higher). Our exploration affirms that asynchronous event-based AI processing can indeed be more efficient, but we need to rethink how we train our SNN models to benefit from it. (Source code available at: \url{https://github.com/RoelMK/asynctorch})
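
To make the contrast in the abstract concrete, below is a minimal, self-contained sketch in plain NumPy. It is not the paper's asynctorch API; the two-layer topology and all names (W1, W2, THRESHOLD, run_layer_synchronized, run_asynchronous) are illustrative assumptions. It compares layer-synchronized inference, where each layer fires in lockstep, with event-driven asynchronous execution, where every neuron checks its threshold as soon as any presynaptic current arrives.

import heapq
import numpy as np

# Illustrative sketch only (not the paper's asynctorch implementation):
# a two-layer integrate-and-fire network run two ways.
rng = np.random.default_rng(0)
W1 = rng.normal(0.5, 0.2, size=(4, 8))   # input -> hidden weights (assumed)
W2 = rng.normal(0.5, 0.2, size=(8, 2))   # hidden -> output weights (assumed)
THRESHOLD = 1.0

def run_layer_synchronized(input_spikes):
    """All neurons in a layer update together; layers fire in lockstep."""
    h = (input_spikes @ W1 >= THRESHOLD).astype(float)   # hidden layer fires first
    out = (h @ W2 >= THRESHOLD).astype(float)            # then the output layer
    return out

def run_asynchronous(input_events):
    """Each neuron evaluates its threshold upon receiving any presynaptic
    current; spikes propagate as individually timestamped events."""
    v_hidden = np.zeros(W1.shape[1])
    v_out = np.zeros(W2.shape[1])
    queue = [(t, 'in', i) for t, i in input_events]  # (time, layer, neuron id)
    heapq.heapify(queue)
    output_spikes = []
    while queue:
        t, layer, i = heapq.heappop(queue)
        if layer == 'in':
            v_hidden += W1[i]  # deliver this input spike's current to hidden neurons
            for j in np.flatnonzero(v_hidden >= THRESHOLD):
                v_hidden[j] = 0.0  # reset and schedule the hidden spike
                heapq.heappush(queue, (t + 1e-3, 'hid', j))  # small synaptic delay
        else:  # a hidden spike arrives at the output layer
            v_out += W2[i]
            for k in np.flatnonzero(v_out >= THRESHOLD):
                v_out[k] = 0.0
                output_spikes.append((t, k))
    return output_spikes

x = np.array([1.0, 0.0, 1.0, 1.0])
print(run_layer_synchronized(x))
print(run_asynchronous([(0.0, 0), (0.1, 2), (0.2, 3)]))

In the asynchronous variant, downstream neurons can fire before the upstream layer has finished processing, since spikes are ordered by a priority queue of timestamped events rather than by layer boundaries. This is precisely the shift in network dynamics that, per the abstract, a model trained under layer synchronization is not prepared for.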
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: We made the following changes in preparation for the camera-ready version:
- We adopted the camera-ready template of TMLR.
- We inserted the complete author information.
- We provide an open-access code repository URL, with the code backing the paper's results, at the end of the abstract.
- We updated the citations to reflect all discussions with the reviewers.
- We added an acknowledgments section to cite our funding sources.
- We fixed some highlighted issues with Fig. 1.
- We removed the color coding that was in place in the last revision to highlight where/how we addressed the reviewers' comments and requests (ref. https://openreview.net/forum?id=mfmAVwtMIk&noteId=T7GqfdRbDZ). The current manuscript has all review comments addressed.
Code: https://github.com/RoelMK/asynctorch
Supplementary Material: pdf
Assigned Action Editor: ~Blake_Aaron_Richards1
Submission Number: 4393