Harnessing Untrained Dynamics: A Reservoir Computing Approach to State-Space Models

ICLR 2026 Conference Submission 19773 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: echo state networks, reservoir computing, recurrent neural networks
Abstract: We introduce the Reservoir State Space Model (RSSM), a novel neural architecture that integrates the structured dynamics of State Space Models (SSMs) with the efficiency of reservoir computing to address long-term dependencies in sequence modeling. Leveraging the linear structure of SSMs, RSSMs implement efficient convolutional operations that maintain a latent internal state, akin to Recurrent Neural Networks (RNNs), while enabling fast and parallelizable computation. We conduct a stability analysis of the underlying SSMs to extend the memory capacity of the model, ensuring rich and expressive hidden representations. A key innovation of RSSM is its use of untrained, structured convolutional dynamics as a fixed reservoir, with learning confined to a lightweight feed-forward readout layer. This design drastically reduces training complexity and computational overhead, making RSSMs well-suited for low-resource or real-time applications. Empirical evaluations on standard sequence modeling benchmarks demonstrate that RSSMs achieve competitive accuracy while offering significant efficiency gains compared to traditional trainable architectures. Our results establish RSSMs as a new class of sequence models that combines the strengths of structured, fixed dynamics with the flexibility of learned representations, offering a compelling trade-off between performance and efficiency.
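To make the mechanism sketched in the abstract concrete, the snippet below gives a minimal NumPy illustration of the general idea: a fixed, untrained linear state-space reservoir whose impulse response is precomputed and applied as a causal convolution over the input, followed by a trainable linear readout. All names, dimensions, the spectral-radius rescaling value, and the ridge-regression readout are illustrative assumptions for this sketch, not the authors' exact method.

```python
# Minimal sketch (assumptions noted above): fixed linear SSM reservoir
# applied as a convolution, plus a trainable linear readout.
import numpy as np

rng = np.random.default_rng(0)

# --- Fixed reservoir: linear SSM  x_t = A x_{t-1} + B u_t ------------------
state_dim, input_dim, seq_len = 64, 1, 200
A = rng.standard_normal((state_dim, state_dim))
# Rescale so the spectral radius is below 1 (an echo-state-like stability
# condition); the value 0.9 is an arbitrary choice for this example.
A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))
B = rng.standard_normal((state_dim, input_dim))

# Because the SSM is linear, its impulse response K_k = A^k B can be
# precomputed once; the hidden state is then a convolution of K with the input.
K = np.stack([np.linalg.matrix_power(A, k) @ B for k in range(seq_len)])

def reservoir_states(u):
    """u: (seq_len, input_dim) -> hidden states (seq_len, state_dim)."""
    x = np.zeros((seq_len, state_dim))
    for t in range(seq_len):
        # x_t = sum_{k<=t} A^k B u_{t-k}  (causal convolution with kernel K)
        x[t] = np.einsum('kdi,ki->d', K[: t + 1], u[t::-1])
    return x

# --- Trainable part: lightweight linear readout fit by ridge regression ----
def fit_readout(inputs, targets, lam=1e-3):
    """inputs: list of (seq_len, input_dim) arrays; targets: (n, out_dim)."""
    H = np.stack([reservoir_states(u)[-1] for u in inputs])  # final states
    return np.linalg.solve(H.T @ H + lam * np.eye(state_dim), H.T @ targets)

# Toy usage: regress the mean of a random sequence from the final state.
inputs = [rng.standard_normal((seq_len, input_dim)) for _ in range(32)]
targets = np.array([[u.mean()] for u in inputs])
W = fit_readout(inputs, targets)
prediction = reservoir_states(inputs[0])[-1] @ W
```

In this sketch, only the readout weights W are learned; the reservoir (A, B, and the kernel K) stays fixed after initialization, which is what confines training cost to the lightweight readout layer described in the abstract.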
Primary Area: learning on time series and dynamical systems
Submission Number: 19773