Keywords: Dynamical Systems, Koopman Operator, Forecasting, Attention Free Transformer
TL;DR: This paper enhances Koopman predictors with a lightweight latent-memory block and dynamic re-encoding, yielding fast, compact, and robust long-term forecasts across diverse dynamical systems.
Abstract: Learning Koopman operators with autoencoders enables linear prediction in a latent space, but long-horizon rollouts often drift off the learned manifold, leading to phase and amplitude errors on systems with switching, continuous spectra, or strong transients. We introduce two complementary components that make Koopman predictors substantially more robust. First, we add an \emph{attention-free latent memory} (AFT) block that aggregates a short window of past latents to produce a corrective residual before each Koopman update. Unlike multi-head attention, AFT operates in linear time with a parameter count nearly identical to the baseline's, yet captures the local temporal context needed to suppress error divergence. Second, we propose \emph{dynamic re-encoding}: lightweight, online change-point triggers (EWMA, CUSUM, and sequential two-sample tests) that detect latent drift and project predictions back onto the autoencoder manifold. Across three benchmark systems (Duffing oscillator, Repressilator, IRMA), our model consistently reduces error accumulation compared to a Koopman autoencoder and a matched-capacity multi-head attention baseline. We also compare against GRU and Transformer autoencoders, evaluated both from initial conditions and with a 50-step context, and find that Koopman+AFT (with optional re-encoding) attains markedly lower long-horizon error while maintaining substantially lower inference latency. We report improvements over horizons up to 1000 steps, together with ablations over trigger policies. The resulting predictors are fast, compact, and geometry-preserving, providing a practical path to long-term forecasting with Koopman methods.
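The abstract describes two mechanisms without giving implementation detail, so the following is a minimal NumPy sketch of how they could fit together: an AFT-style, linear-time aggregation over a short window of past latents that yields a corrective residual before each linear Koopman step, and an EWMA drift statistic that triggers re-encoding (a decode-then-encode projection back onto the autoencoder manifold). All names (`aft_residual`, `encode`, `decode`, `rollout`), shapes, and the toy encoder/decoder and Koopman matrix are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch only: AFT-style latent memory + EWMA-triggered re-encoding
# around a linear Koopman rollout. Shapes, weights, and the identity
# encoder/decoder are stand-ins for the trained components described in the paper.
import numpy as np

rng = np.random.default_rng(0)
d, window = 16, 8                                      # latent dimension, memory window

K = np.eye(d) + 0.01 * rng.standard_normal((d, d))     # stand-in for the learned Koopman matrix
Wq, Wk, Wv = (0.1 * rng.standard_normal((d, d)) for _ in range(3))
pos_bias = 0.1 * rng.standard_normal(window)           # AFT-style learned position bias

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def aft_residual(z, memory):
    """Linear-time, attention-free aggregation of the last `window` latents.

    Past latents contribute elementwise-weighted values (keys plus a position
    bias), and the current latent gates the aggregate through sigmoid(q),
    mirroring the Attention Free Transformer update.
    """
    q = z @ Wq
    ks = memory @ Wk + pos_bias[-len(memory):, None]   # (T, d) keys + position bias
    vs = memory @ Wv                                    # (T, d) values
    w = np.exp(ks - ks.max(axis=0))                     # numerically stable weights
    agg = (w * vs).sum(axis=0) / (w.sum(axis=0) + 1e-8)
    return sigmoid(q) * agg                              # corrective residual

def encode(x):   # toy stand-in for the trained encoder
    return x

def decode(z):   # toy stand-in for the trained decoder
    return z

def rollout(z0, steps, ewma_lambda=0.1, threshold=3.0):
    """Koopman rollout with AFT correction and EWMA-triggered re-encoding."""
    z, memory, preds, ewma = z0, [z0], [], 0.0
    for _ in range(steps):
        z_corr = z + aft_residual(z, np.stack(memory[-window:]))  # memory correction
        z_next = K @ z_corr                                        # linear Koopman step
        # EWMA drift statistic on the one-step latent increment norm
        ewma = (1 - ewma_lambda) * ewma + ewma_lambda * np.linalg.norm(z_next - z)
        if ewma > threshold:                                       # change-point trigger fires
            z_next = encode(decode(z_next))                        # dynamic re-encoding
            ewma = 0.0
        memory.append(z_next)
        preds.append(z_next)
        z = z_next
    return np.stack(preds)

preds = rollout(rng.standard_normal(d), steps=100)
print(preds.shape)   # (100, 16)
```

The same trigger slot could host a CUSUM statistic or a sequential two-sample test in place of the EWMA, as the abstract lists; only the drift-detection line would change.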
Primary Area: learning on time series and dynamical systems
Submission Number: 20994