TimeDiT: General-purpose Diffusion Transformers for Time Series Foundation Model

Published: 03 Jul 2024, Last Modified: 17 Jul 2024, ICML 2024 FM-Wild Workshop Poster, CC BY 4.0
Keywords: Foundation model; Time series; Generative model; Diffusion model
TL;DR: We introduce TimeDiT, a diffusion-based transformer approach to building a versatile and robust foundation model for diverse time series tasks.
Abstract: Time series modeling is critical for many real-world applications, but most existing approaches are task-specific. Unique characteristics such as missing values, irregular sampling, multiple resolutions, and complex temporal dependencies make it challenging to develop general foundation models for time series. In this paper, we introduce the Time Series Diffusion Transformer (TimeDiT), equipped with three distinct masking schemes that enable a uniform training and inference pipeline across diverse time series tasks. TimeDiT leverages the transformer architecture to capture temporal dependencies and employs diffusion processes to generate high-quality candidate samples without stringent assumptions on the target distribution. Extensive experiments on datasets spanning forecasting, imputation, and anomaly detection demonstrate the model's effectiveness. Both in-domain and zero-shot evaluations confirm the model's potential to serve as a robust foundation model for multiple time series applications.
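
The abstract describes the idea but gives no implementation details. Below is a minimal PyTorch sketch, under our own assumptions, of how a single observation-mask convention can unify forecasting, imputation, and unconditional generation within one denoising-diffusion training step. All names here (TinyDenoiser, make_task_mask, diffusion_training_step) are hypothetical illustrations, not the paper's code; the convolutional denoiser merely stands in for TimeDiT's transformer backbone.

import torch

class TinyDenoiser(torch.nn.Module):
    """Stand-in for the transformer backbone (hypothetical; a real DiT would
    use attention blocks and a timestep embedding)."""
    def __init__(self, channels):
        super().__init__()
        self.net = torch.nn.Conv1d(channels + 1, channels, kernel_size=3, padding=1)

    def forward(self, x, t, mask):
        # Concatenate the observation mask as an extra channel so the network
        # knows which positions are conditioning context. The timestep t is
        # ignored here purely for brevity.
        return self.net(torch.cat([x, mask[:, :1]], dim=1))

def make_task_mask(task, series, horizon=None, missing_idx=None):
    """Build an observation mask (1 = observed context, 0 = to be generated).
    One mask convention covers forecasting, imputation, and unconditional
    generation, which is how a single pipeline can serve several tasks."""
    mask = torch.ones_like(series)
    if task == "forecast":
        mask[..., -horizon:] = 0.0          # hide the future window
    elif task == "imputation":
        mask[..., missing_idx] = 0.0        # hide the missing timestamps
    elif task == "unconditional":
        mask.zero_()                        # generate the entire series
    return mask

def diffusion_training_step(model, x0, mask, n_steps=1000):
    """One DDPM-style step: noise only the unknown region, keep the observed
    region clean as conditioning, and regress the injected noise there."""
    t = torch.randint(0, n_steps, (x0.shape[0],))
    betas = torch.linspace(1e-4, 0.02, n_steps)
    alpha_bar = torch.cumprod(1.0 - betas, dim=0)[t].view(-1, 1, 1)
    noise = torch.randn_like(x0)
    x_t = alpha_bar.sqrt() * x0 + (1.0 - alpha_bar).sqrt() * noise
    x_in = mask * x0 + (1.0 - mask) * x_t   # observed values stay noise-free
    pred = model(x_in, t, mask)
    # Score the prediction only where the model must generate.
    return (((pred - noise) * (1.0 - mask)) ** 2).sum() / (1.0 - mask).sum()

# Usage: the same training step handles a forecasting batch...
x0 = torch.randn(8, 4, 96)                  # 8 series, 4 channels, 96 steps
model = TinyDenoiser(channels=4)
loss = diffusion_training_step(model, x0, make_task_mask("forecast", x0, horizon=24))
loss.backward()
# ...and an imputation batch, with no change to the model or the loss.
miss = torch.tensor([10, 11, 12, 50])
loss = diffusion_training_step(model, x0, make_task_mask("imputation", x0, missing_idx=miss))

The design point this sketch illustrates: the task identity lives entirely in the mask, so training and inference code stay identical across tasks, matching the uniform-pipeline claim in the abstract.
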
Submission Number: 43