Expressive Power of Recurrent Spiking Neural Networks for Sequence Modeling

Published: 01 Mar 2026, Last Modified: 01 Mar 2026
ICLR 2026 TSALM Workshop Poster, CC BY 4.0
Keywords: spiking neural networks, neuromorphic computing, expressivity, theory
TL;DR: We study the expressivity of recurrent spiking neural networks on sequential data
Abstract: Spiking neural networks (SNNs) provide a biologically inspired and potentially efficient framework for processing sequential data, but their expressive power, particularly in the dynamical setting, remains poorly understood. We establish a universality result showing that recurrent SNNs can approximate a broad class of sequence-to-sequence mappings, thereby providing theoretical support for their use in temporal learning and time series processing.
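To make the setting concrete, below is a minimal sketch of a recurrent spiking neural network with leaky integrate-and-fire (LIF) neurons mapping an input sequence to an output spike sequence. All names, weights, and parameters (`beta`, `threshold`, network sizes) are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid = 3, 8
W_in = rng.normal(scale=0.5, size=(n_hid, n_in))    # input weights (assumed)
W_rec = rng.normal(scale=0.3, size=(n_hid, n_hid))  # recurrent weights (assumed)

def run_snn(inputs, beta=0.9, threshold=1.0):
    """Run a recurrent LIF network over a sequence of input vectors,
    returning the spike train of the hidden layer at each time step."""
    v = np.zeros(n_hid)  # membrane potentials
    s = np.zeros(n_hid)  # spikes emitted at the previous step
    spike_trains = []
    for x in inputs:
        # leaky integration of feedforward input and recurrent spike currents
        v = beta * v + W_in @ x + W_rec @ s
        s = (v >= threshold).astype(float)  # neurons at/above threshold fire
        v = v * (1.0 - s)                   # hard reset for neurons that fired
        spike_trains.append(s.copy())
    return np.array(spike_trains)

T = 10
inputs = rng.normal(size=(T, n_in))
spikes = run_snn(inputs)
print(spikes.shape)  # (10, 8): one binary spike vector per time step
```

The recurrence makes the network's output at each step depend on the entire input history, which is the sequence-to-sequence setting whose approximation power the abstract addresses.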
Track: Research Track (max 4 pages)
Submission Number: 99