Keywords: spiking neural networks, long-range dependencies, event data modeling, HiPPO matrix, state space models
TL;DR: We introduce SPLR, a novel spiking neural network architecture that effectively captures long-range temporal dependencies through Spike-Aware HiPPO and dendrite attention.
Abstract: Spiking Neural Networks (SNNs) offer an efficient framework for processing event-driven data due to their sparse, spike-based communication, making them well suited to real-time tasks. However, their inability to capture long-range dependencies limits their effectiveness in complex temporal modeling. To address this challenge, we present **SPLR (SPiking Network for Learning Long-range Relations)**, a novel architecture designed to overcome these limitations. The core contribution of SPLR is the **Spike-Aware HiPPO (SA-HiPPO)** mechanism, which adapts the HiPPO framework to discrete, spike-driven inputs, enabling efficient long-range memory retention in event-driven systems. Additionally, SPLR includes a convolutional layer that integrates state-space dynamics to enhance feature extraction while preserving the efficiency of sparse, asynchronous processing. Together, these components enable SPLR to model both short- and long-term dependencies effectively, outperforming prior methods on a range of event-based datasets. Experimental results demonstrate that SPLR achieves superior performance on tasks requiring fine-grained temporal dynamics and long-range memory, establishing it as a scalable and efficient solution for real-time applications such as event-based vision and sensor fusion in neuromorphic computing.
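To make the abstract's central idea concrete, the sketch below shows what adapting HiPPO to spike-driven inputs could look like: the standard HiPPO-LegS transition matrices combined with an event-driven bilinear discretization, where the memory state is advanced by the irregular inter-spike interval `dt` rather than a fixed clock. This is a minimal illustration under stated assumptions, not the paper's SA-HiPPO mechanism; the function names and the choice of bilinear discretization are ours.

```python
import numpy as np

def hippo_legs(N):
    # HiPPO-LegS continuous-time transition matrices (Gu et al., 2020):
    # A[n,k] = -sqrt((2n+1)(2k+1)) for n > k, -(n+1) for n == k, else 0;
    # B[n] = sqrt(2n+1).
    A = np.zeros((N, N))
    for n in range(N):
        for k in range(N):
            if n > k:
                A[n, k] = -np.sqrt((2 * n + 1) * (2 * k + 1))
            elif n == k:
                A[n, k] = -(n + 1)
    B = np.sqrt(2 * np.arange(N) + 1.0).reshape(N, 1)
    return A, B

def spike_update(h, spike, dt, A, B):
    # Hypothetical "spike-aware" step: discretize with the elapsed time
    # since the previous event (bilinear / Tustin transform), so the
    # memory state h evolves only when a spike arrives, matching the
    # asynchronous nature of event-driven data.
    I = np.eye(A.shape[0])
    Ad = np.linalg.solve(I - (dt / 2) * A, I + (dt / 2) * A)
    Bd = np.linalg.solve(I - (dt / 2) * A, dt * B)
    return Ad @ h + Bd * spike
```

Driving this update with an irregular spike train (pairs of inter-event interval and spike value) maintains a compressed summary of the input history in `h`, which is the long-range-memory property the abstract attributes to SA-HiPPO.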
Supplementary Material: zip
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9155