Leto: Modeling Multivariate Time Series with Memorizing at Test Time

Published: 10 Jun 2025, Last Modified: 21 Jul 2025 · ICML 2025 Oral · CC BY 4.0
Keywords: Multivariate Time Series, Time Series Forecasting, Test-Time Memorization, Transformers, Recurrent Neural Networks
Abstract: Modeling multivariate time series remains a core challenge due to complex temporal and cross-variate dependencies. While sequence models like Transformers, CNNs, and RNNs have been adapted from NLP and vision tasks, they often struggle with multivariate structure, long-range dependencies, or error propagation. We introduce Leto, a 2D memory module that leverages temporal inductive bias while preserving variate permutation equivariance. By combining in-context memory with cross-variate attention, Leto effectively captures temporal patterns and inter-variate signals. Experiments across diverse benchmarks—forecasting, classification, and anomaly detection—demonstrate its strong performance.
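The abstract describes two ingredients: a per-variate memory that is updated (memorized) at test time, and cross-variate attention that remains equivariant to permutations of the variates. A minimal sketch of how such a pipeline could fit together is below; the function names, the outer-product fast-weight update, and the shapes are illustrative assumptions, not Leto's actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def temporal_memory_readout(x, lr=0.1):
    """Per-variate fast-weight memory updated at test time.

    x: array of shape (T, V, d) -- T time steps, V variates, d features.
    At each step we read from the memory, then write the current token
    with a hypothetical outer-product update (a stand-in for the paper's
    in-context memorization rule). Each variate has its own memory, so
    the map is permutation-equivariant across variates.
    """
    T, V, d = x.shape
    M = np.zeros((V, d, d))
    out = np.zeros_like(x)
    for t in range(T):
        for v in range(V):
            out[t, v] = M[v] @ x[t, v]                # read from memory
            M[v] += lr * np.outer(x[t, v], x[t, v])   # write (test-time memorization)
    return out

def cross_variate_attention(h):
    """Plain dot-product attention across variates at each time step.

    Softmax attention over the variate axis is permutation-equivariant:
    reordering the variates reorders the outputs the same way.
    """
    T, V, d = h.shape
    out = np.zeros_like(h)
    for t in range(T):
        scores = h[t] @ h[t].T / np.sqrt(d)           # (V, V) variate-variate scores
        out[t] = softmax(scores, axis=-1) @ h[t]
    return out

# Toy usage: combine the memory readout with cross-variate attention.
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 3, 4))
y = cross_variate_attention(x + temporal_memory_readout(x))
```

Because both stages treat the variate axis symmetrically, permuting the input variates permutes the output identically, which is the equivariance property the abstract highlights.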
Submission Number: 33