Keywords: time series, SNN, resonate-and-fire neurons, SSM, activity recognition
TL;DR: Energy-efficient second-order SSM for learning very long sequences
Abstract: Spiking neural networks have attracted increasing attention for their energy efficiency, multiplication-free computation, and sparse event-based processing. In parallel, state space models have emerged as scalable alternatives to transformers for long-range sequence modeling by avoiding the quadratic dependence on sequence length. We propose SHaRe-SSM (Spiking Harmonic Resonate-and-Fire State Space Model), a second-order spiking SSM for classification and regression on ultra-long sequences. SHaRe-SSM outperforms transformers and first-order SSMs on average while eliminating matrix multiplications, making it highly suitable for resource-constrained applications. To ensure fast computation over tens of thousands of time steps, we leverage a parallel scan formulation of the underlying dynamical system. Furthermore, we introduce a kernel-based spiking regressor, enabling accurate modeling of dependencies in sequences of up to 50k steps. Our results demonstrate that SHaRe-SSM achieves superior long-range modeling capability while consuming $52.1\times$ less energy than an ANN-based second-order SSM, positioning it as a strong candidate for resource-constrained devices such as wearables.
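The parallel scan mentioned in the abstract exploits the fact that a linear recurrence is associative, so it can be evaluated in O(log T) parallel steps rather than T sequential ones. As a minimal illustrative sketch (not the paper's implementation, which operates on the second-order resonate-and-fire state), here is a Hillis-Steele-style inclusive scan for the scalar recurrence $x_t = a_t x_{t-1} + b_t$, using the composition rule $(a_2, b_2) \circ (a_1, b_1) = (a_2 a_1,\; a_2 b_1 + b_2)$:

```python
import numpy as np

def linear_recurrence_scan(a, b):
    """Parallel-style inclusive scan for x_t = a_t * x_{t-1} + b_t, x_{-1} = 0.

    Each element carries an affine map (a, b); combining two maps uses
    (a2, b2) o (a1, b1) = (a2*a1, a2*b1 + b2). The doubling loop below
    performs O(log T) rounds, each of which is fully parallelizable.
    """
    a = np.asarray(a, dtype=float).copy()
    b = np.asarray(b, dtype=float).copy()
    T = len(a)
    shift = 1
    while shift < T:
        # Neutral element (a=1, b=0) pads positions with no left neighbor.
        a_prev = np.concatenate([np.ones(shift), a[:-shift]])
        b_prev = np.concatenate([np.zeros(shift), b[:-shift]])
        b = a * b_prev + b   # compose with the map `shift` steps to the left
        a = a * a_prev
        shift *= 2
    return b  # b[t] now equals x_t
```

A second-order (harmonic) SSM fits the same template by replacing the scalars `a_t` with 2x2 state-transition matrices (or complex numbers), since matrix composition is equally associative.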
Submission Number: 89