Keywords: Reservoir Computing, Echo State Networks, Persistent Cohomology, Circular Coordinates, Markov Transition Modeling, Topological Data Analysis, Delay Embedding, Spectral Scaling, Echo-State Property, Chaotic Time Series Forecasting
TL;DR: Learn the ESN reservoir itself from a single trajectory by fusing persistent-loop topology with short-horizon Markov flow: build W from rotation blocks plus lifted transitions, then scale it for the echo-state property (ESP); no BPTT needed.
Abstract: We study whether embedding global topology and local transport into a fixed reservoir can improve phase tracking and prediction. From a single delay‑embedded trajectory, we build a recurrent operator in two parts: (i) long‑lived $H_1$ classes from persistent cohomology are converted to circular coordinates whose average phase velocities instantiate stable $2\times2$ rotation blocks, and (ii) short‑horizon transition counts over a coarse partition define a Markov model whose action is lifted back to neuron space through sparse, stochastic pooling and lifting maps. A convex blend of these topological and flow components is scaled by power iteration to a preset operator‑norm bound, yielding a leaky ESN with a straightforward echo‑state guarantee; only a ridge‑regularized linear readout is trained. The resulting reservoir is fixed, interpretable, and analyzable: its internal oscillators reflect the attractor’s dominant loops, while its couplings align with observed local transport. In experiments on chaotic systems and real‑world series, the method is data‑efficient and maintains the computational profile of standard ESNs, while delivering improved phase tracking and competitive—often superior—multistep forecasts relative to tuned random reservoirs of the same size. Overall, the framework offers a principled alternative to sampling‑based wiring by learning the reservoir once from data.
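To make the construction concrete, here is a minimal sketch, not the authors' implementation: it assembles W as a convex blend of rotation blocks and a lifted Markov transition operator, rescales it to a preset spectral-norm bound by power iteration, and trains only a ridge readout on a leaky ESN. The loop phase velocities `omegas` (which the paper extracts via persistent cohomology and circular coordinates), the quantile partition, the blend weight `alpha`, the random neuron-to-cell assignment, and the bound `rho_max` are all illustrative assumptions.

```python
# Hedged sketch of the reservoir construction; all names and defaults are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def topo_operator(n, omegas):
    """n x n operator with one 2x2 rotation block per persistent H_1 loop;
    block angles are the loops' average phase velocities (rad/step)."""
    W = np.zeros((n, n))
    for i, w in enumerate(omegas):
        c, s = np.cos(w), np.sin(w)
        W[2*i:2*i+2, 2*i:2*i+2] = [[c, -s], [s, c]]
    return W

def flow_operator(n, labels, k):
    """Row-stochastic Markov matrix from short-horizon transition counts over a
    k-cell partition, lifted to neuron space by sparse stochastic pool/lift maps."""
    P = np.zeros((k, k))
    for a, b in zip(labels[:-1], labels[1:]):        # short-horizon transition counts
        P[a, b] += 1.0
    P /= np.maximum(P.sum(axis=1, keepdims=True), 1.0)
    pool = np.zeros((k, n)); lift = np.zeros((n, k))
    cells = rng.integers(0, k, size=n)               # random neuron-to-cell map (assumption)
    for j in range(n):
        pool[cells[j], j] = 1.0
        lift[j, cells[j]] = 1.0
    pool /= np.maximum(pool.sum(axis=1, keepdims=True), 1.0)  # stochastic pooling
    return lift @ P.T @ pool                         # neuron-space action of the flow

def scale_to_norm(W, rho_max=0.9, iters=100):
    """Power iteration on W^T W estimates the spectral norm; rescaling so that
    ||W||_2 <= rho_max < 1 is a standard sufficient condition for the ESP
    with a 1-Lipschitz activation such as tanh."""
    v = rng.standard_normal(W.shape[1])
    for _ in range(iters):
        v = W.T @ (W @ v)
        v /= np.linalg.norm(v)
    sigma = np.linalg.norm(W @ v)
    return W * (rho_max / sigma)

# Assemble the reservoir and run a leaky ESN with a ridge-regularized readout.
n, k, alpha, leak, lam = 200, 20, 0.5, 0.3, 1e-6
u = np.sin(0.1 * np.arange(2000)) + 0.1 * rng.standard_normal(2000)   # toy series
edges = np.quantile(u, np.linspace(0, 1, k + 1)[1:-1])                # coarse partition
labels = np.digitize(u, edges)
omegas = [0.10, 0.20, 0.35]                          # assumed loop phase velocities

W = scale_to_norm(alpha * topo_operator(n, omegas)
                  + (1 - alpha) * flow_operator(n, labels, k))
W_in = 0.5 * rng.standard_normal(n)

X = np.zeros((n, len(u))); x = np.zeros(n)
for t in range(len(u) - 1):
    x = (1 - leak) * x + leak * np.tanh(W @ x + W_in * u[t])
    X[:, t + 1] = x

Y, Xs = u[1:], X[:, 1:]                              # one-step-ahead targets
W_out = Y @ Xs.T @ np.linalg.inv(Xs @ Xs.T + lam * np.eye(n))  # ridge readout
```

Prediction then amounts to `W_out @ x` at each step; only `W_out` is trained, so the reservoir stays fixed and the computational profile matches a standard ESN of the same size.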
Supplementary Material: zip
Primary Area: learning on time series and dynamical systems
Submission Number: 881