Edge of Stability Echo State Network

Published: 01 Jan 2025, Last Modified: 15 May 2025. IEEE Trans. Neural Networks Learn. Syst. 2025. License: CC BY-SA 4.0
Abstract: Echo state networks (ESNs) are time series processing models that operate under the echo state property (ESP) principle. The ESP is a notion of stability that imposes an asymptotic fading of the memory of the input. On the other hand, the resulting inherent architectural bias of ESNs may lead to an excessive loss of information, which in turn harms performance on tasks with long short-term memory requirements. To reconcile the fading memory property with the ability to retain as much memory as possible, in this article we introduce a new ESN architecture, called the Edge of Stability ESN (ES2N). The ES2N model defines the reservoir layer as a convex combination of a nonlinear reservoir (as in the standard ESN) and a linear reservoir implementing an orthogonal transformation. By virtue of a thorough mathematical analysis, we prove that the whole eigenspectrum of the Jacobian of the ES2N map can be contained in an annular neighborhood of a complex circle of controllable radius. This property is exploited to tune the ES2N's dynamics close to the edge-of-chaos regime by design. Remarkably, our experimental analysis shows that the ES2N model can reach the theoretical maximum short-term memory capacity (MC). At the same time, in comparison to conventional reservoir approaches, ES2N offers an excellent trade-off between memory and nonlinearity, as well as a significant improvement in performance on autoregressive nonlinear modeling and real-world time series modeling.
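The abstract describes the reservoir update as a convex combination of a standard tanh reservoir and an orthogonal linear map, with the Jacobian's eigenspectrum confined to an annulus around a circle of controllable radius. A minimal NumPy sketch of this idea is given below; the symbol names (`beta`, `rho`, `omega`) and the exact parameterization of the update are assumptions for illustration, not the paper's definitive formulation. The annulus check uses the Bauer-Fike bound, which applies because the orthogonal part `beta * O` is a normal matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n_units, n_in = 100, 1
beta = 0.1            # mixing coefficient of the orthogonal part (assumed name)
rho, omega = 0.9, 1.0  # reservoir spectral radius and input scaling (assumed)

# Random orthogonal matrix via QR decomposition of a Gaussian matrix.
O, _ = np.linalg.qr(rng.standard_normal((n_units, n_units)))

# Standard ESN-style recurrent and input weights.
W = rng.uniform(-1.0, 1.0, (n_units, n_units))
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))  # rescale spectral radius to rho
W_in = rng.uniform(-omega, omega, (n_units, n_in))

def es2n_step(x, u):
    """Convex combination of an orthogonal linear map and a tanh reservoir."""
    return beta * (O @ x) + (1.0 - beta) * np.tanh(W @ x + W_in @ u)

# Drive the reservoir with a short random input sequence.
x = np.zeros(n_units)
for _ in range(200):
    x = es2n_step(x, rng.standard_normal(n_in))

# Jacobian of the map at the final state (zero input):
# J = beta*O + (1-beta) * diag(1 - tanh(a)^2) @ W.
a = W @ x
D = np.diag(1.0 - np.tanh(a) ** 2)
J = beta * O + (1.0 - beta) * D @ W
eigs = np.linalg.eigvals(J)

# Since beta*O is normal with all eigenvalues of modulus beta, Bauer-Fike
# places every eigenvalue of J within eps of the circle of radius beta.
eps = (1.0 - beta) * np.linalg.norm(D @ W, 2)
assert np.all(np.abs(np.abs(eigs) - beta) <= eps + 1e-9)
```

The state norm stays bounded by sqrt(n_units): the orthogonal term contracts it by `beta`, and the tanh term contributes at most `(1 - beta) * sqrt(n_units)` per step.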