Unveiling and Manipulating Concepts in Time Series Foundation Models

Published: 10 Oct 2024, Last Modified: 26 Nov 2024. NeurIPS 2024 TSALM Workshop. License: CC BY 4.0
Keywords: Time Series Foundation Models, Model Inspection, Synthetic Data
TL;DR: We explore and steer time series foundation models by identifying and manipulating learned concepts, showing their potential for controlled predictions using synthetic data.
Abstract: Time series foundation models promise to be powerful tools for a wide range of applications. However, little is known about the concepts that these models learn and how we can manipulate them in the latent space. Our study bridges these gaps by identifying concepts learned by these models, localizing them to specific parts of the model, and steering model predictions along these conceptual directions, using synthetic time series data. Our results show that MOMENT, a state-of-the-art foundation model, can discern distinct time series patterns, and that this ability peaks in the middle layers of the network. Moreover, we show that model outputs can be steered using insights from its activations (e.g., by introducing periodic trends to initially constant signals through intervention during inference). Our findings underscore the importance of synthetic data for studying and steering time series foundation models, and of intervening throughout the whole model (using steering matrices) rather than at a single layer.
Submission Number: 55
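As a rough illustration of the kind of activation-level intervention the abstract describes, below is a minimal PyTorch sketch that derives a concept direction from synthetic signals and adds it to hidden states across several layers at inference time. All names (`model.encoder.layers`, `constant_signal_batch`, the additive-vector formulation) are hypothetical simplifications; the paper itself intervenes with per-layer steering matrices, not a single shared vector.

```python
import torch

# Sketch (not the paper's implementation): estimate a "concept direction" as the
# difference between mean hidden activations for two synthetic signal types
# (e.g., periodic vs. constant), then inject it into hidden states during inference.

def concept_direction(acts_a: torch.Tensor, acts_b: torch.Tensor) -> torch.Tensor:
    # acts_*: (num_samples, hidden_dim) activations collected at one layer
    return acts_a.mean(dim=0) - acts_b.mean(dim=0)

class SteeringHook:
    """Forward hook that shifts a layer's output along a concept direction."""
    def __init__(self, direction: torch.Tensor, alpha: float = 1.0):
        self.direction = direction
        self.alpha = alpha

    def __call__(self, module, inputs, output):
        # Add the scaled direction to every token's hidden state; handle layers
        # that return tuples (hidden_states, ...) as well as plain tensors.
        if isinstance(output, tuple):
            return (output[0] + self.alpha * self.direction,) + output[1:]
        return output + self.alpha * self.direction

# Hypothetical usage, assuming a transformer-style time series model whose
# encoder blocks are exposed as `model.encoder.layers`:
#
# handles = [
#     layer.register_forward_hook(SteeringHook(direction, alpha=2.0))
#     for layer in model.encoder.layers   # intervene throughout the model
# ]
# steered_output = model(constant_signal_batch)
# for h in handles:
#     h.remove()
```

Registering the hook on every encoder block mirrors the abstract's point that intervening throughout the model, rather than at a single layer, is what makes the steering effective.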