SKOLR: Structured Koopman Operator Linear RNN for Time-Series Forecasting

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
Abstract: Koopman operator theory provides a framework for nonlinear dynamical system analysis and time-series forecasting by mapping dynamics to a space of real-valued measurement functions, enabling a linear operator representation. Despite the advantage of linearity, the operator is generally infinite-dimensional. Therefore, the objective is to learn measurement functions that yield a tractable finite-dimensional Koopman operator approximation. In this work, we establish a connection between Koopman operator approximation and linear Recurrent Neural Networks (RNNs), which have recently demonstrated remarkable success in sequence modeling. We show that by considering an extended state consisting of lagged observations, we can establish an equivalence between a structured Koopman operator and linear RNN updates. Building on this connection, we present SKOLR, which integrates a learnable spectral decomposition of the input signal with a multilayer perceptron (MLP) as the measurement functions and implements a structured Koopman operator via a highly parallel linear RNN stack. Numerical experiments on various forecasting benchmarks and dynamical systems show that this streamlined, Koopman-theory-based design delivers exceptional performance. Our code is available at: https://github.com/networkslab/SKOLR.
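The core idea in the abstract, that a finite-dimensional Koopman approximation acting on learned measurement functions of lagged observations reduces to a linear RNN update, can be illustrated with a minimal numpy sketch. Everything below is a hypothetical stand-in, not the authors' implementation: the dimensions, the random-feature "measurement function" (playing the role of the learnable MLP encoder), and the unstructured matrix `K` (which SKOLR instead structures and runs as a parallel linear RNN stack) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): observation dim,
# latent (measurement) dim, number of lagged observations.
d_obs, d_latent, n_lags = 1, 8, 4

# Stand-in "measurement function": a fixed random feature map playing
# the role of the paper's learnable MLP encoder.
W_enc = rng.normal(size=(d_latent, d_obs * n_lags))

def measure(x_window):
    # x_window: concatenated lagged observations, shape (d_obs * n_lags,)
    return np.tanh(W_enc @ x_window)

# Finite-dimensional Koopman approximation acting linearly on the latent
# state; in SKOLR this matrix is structured rather than dense.
K = rng.normal(size=(d_latent, d_latent)) / np.sqrt(d_latent)
B = rng.normal(size=(d_latent, d_obs))

def linear_rnn_step(h, x_t):
    # Linear RNN update: K advances the latent state linearly,
    # while B injects the newest observation.
    return K @ h + B @ x_t

# Roll the recurrence over a toy scalar time series.
series = np.sin(np.linspace(0, 2 * np.pi, 16))[:, None]
h = measure(series[:n_lags].ravel())   # encode the initial lag window
for x_t in series[n_lags:]:
    h = linear_rnn_step(h, x_t)

print(h.shape)
```

Because every `linear_rnn_step` is linear in `h`, the whole rollout is a product of matrices applied to the encoded initial window, which is what makes the recurrence parallelizable in the way the abstract describes.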
Lay Summary: From tracking weather patterns to forecasting stock prices, predicting how things change over time (time-series forecasting) is a tough challenge, especially when those changes follow complex, nonlinear rules. In this work, we introduce a new method that bridges an existing mathematical theory with practical machine learning techniques to tackle this problem efficiently. Our approach is inspired by Koopman operator theory, a powerful mathematical framework for modeling complex nonlinear systems with linear transformations. We establish a theoretical connection between the approximation of Koopman operators and a lightweight type of neural network called the linear Recurrent Neural Network (linear RNN). Leveraging this connection, we develop a novel linear RNN approach called Structured Koopman Operator Linear RNN (SKOLR). SKOLR stands out by blending mathematical foundations with efficient computation. Tests across various datasets, from physics simulations to real-world time-series benchmarks, show that it performs impressively with low computational burden. This work opens new possibilities for efficient, theoretically grounded models in time-series forecasting and dynamical system analysis, and may influence future research directions in these fields.
Primary Area: Deep Learning->Sequential Models, Time series
Keywords: Koopman Operator Theory, Linear RNN, Time-Series Forecasting
Submission Number: 14482