Keywords: multivariate time series forecasting, tensor product representation, compositional generalization, structured representation learning, relational learning
TL;DR: We propose TS-TPR, a compositional forecasting model using tensor product representations, explicitly capturing dynamic inter-variable relationships in multivariate time series with improved accuracy and interpretability.
Abstract: Real-world multivariate time series exhibit nonstationary inter-variable dependencies that evolve due to external environmental shifts. While capturing these intricate dynamics is crucial for accurate forecasting, many existing methods still struggle to model these relationships explicitly. This motivates compositional learning, which separates relational and temporal components and flexibly recombines them, allowing models to adapt to time-varying inter-variable relationships and generalize to unseen patterns. To this end, we introduce TS-TPR, a novel framework that employs tensor product representations for compositional learning. Specifically, context-aware role generation identifies the most salient relationships at each time step, while hierarchical filler extraction summarizes the corresponding temporal patterns. By combining these dynamically generated roles and fillers via tensor products, TS-TPR creates an explicit, structured representation that naturally scales to many variables and adapts as dependencies shift. Through experiments on diverse real-world benchmarks, we show that TS-TPR not only outperforms state-of-the-art baselines but also provides interpretable, time-varying insights into inter-series interactions.
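The core binding operation the abstract describes can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the role and filler vectors below are random stand-ins for TS-TPR's learned context-aware roles and hierarchical fillers, and all dimensions are hypothetical. It shows the classic tensor-product-representation mechanics: each role is bound to its filler via an outer product, the bindings are superposed into one structured representation, and (with orthonormal roles) a filler can be recovered exactly by querying with its role.

```python
import numpy as np

rng = np.random.default_rng(0)
n_rel, d_role, d_fill = 4, 8, 6  # hypothetical sizes: relationships, role dim, filler dim

# Stand-ins for learned components (random here, learned in the paper):
# orthonormal "role" vectors, one per salient inter-variable relationship,
# and "filler" vectors summarizing the corresponding temporal patterns.
roles, _ = np.linalg.qr(rng.standard_normal((d_role, n_rel)))  # columns are orthonormal
fillers = rng.standard_normal((n_rel, d_fill))

# Bind each role to its filler with an outer product, then superpose:
# T = sum_i roles[:, i] (x) fillers[i]  ->  a (d_role, d_fill) matrix.
T = sum(np.outer(roles[:, i], fillers[i]) for i in range(n_rel))

# Unbind: querying T with a role recovers its filler, since the roles
# are orthonormal and cross-terms cancel.
recovered = roles[:, 2] @ T
print(np.allclose(recovered, fillers[2]))  # True
```

Because bindings are simply summed, adding or dropping a relationship only adds or removes one outer-product term, which is what lets a TPR-style representation scale with the number of variables and adapt as dependencies shift.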
Primary Area: learning on time series and dynamical systems
Submission Number: 8872