TimeDART: A Diffusion Autoregressive Transformer for Self-Supervised Time Series Representation

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
Abstract: Self-supervised learning has garnered increasing attention in time series analysis for benefiting various downstream tasks and reducing reliance on labeled data. Despite its effectiveness, existing methods often struggle to comprehensively capture both long-term dynamic evolution and subtle local patterns in a unified manner. In this work, we propose \textbf{TimeDART}, a novel self-supervised time series pre-training framework that unifies two powerful generative paradigms to learn more transferable representations. Specifically, we first employ a causal Transformer encoder, accompanied by a patch-based embedding strategy, to model the evolving trends from left to right. Building on this global modeling, we further introduce a denoising diffusion process to capture fine-grained local patterns through forward diffusion and reverse denoising. Finally, we optimize the model in an autoregressive manner. As a result, TimeDART effectively accounts for both global and local sequence features in a coherent way. We conduct extensive experiments on public datasets for time series forecasting and classification. The experimental results demonstrate that TimeDART consistently outperforms previous compared methods, validating the effectiveness of our approach. Our code is available at \url{https://github.com/Melmaphother/TimeDART}.
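The two ingredients named in the abstract — patch-based embedding and a forward-diffusion noising step — can be sketched in a few lines. This is an illustrative, hypothetical sketch (the function names, patch length, and linear noise schedule are assumptions, not the authors' implementation; see the linked repository for the actual code). The forward process follows the standard DDPM formulation, x_t = sqrt(ᾱ_t)·x_0 + sqrt(1 − ᾱ_t)·ε:

```python
import numpy as np

def patchify(series, patch_len):
    """Split a 1-D series into non-overlapping patches (illustrative;
    the patch length here is an assumed hyperparameter)."""
    n = len(series) // patch_len
    return series[: n * patch_len].reshape(n, patch_len)

def forward_diffuse(x0, t, betas, rng):
    """Standard DDPM forward process at step t:
    x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps."""
    alphas = 1.0 - betas
    alpha_bar = np.cumprod(alphas)
    eps = rng.standard_normal(x0.shape)
    xt = np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps
    return xt, eps

rng = np.random.default_rng(0)
series = rng.standard_normal(96)          # toy univariate series
patches = patchify(series, patch_len=8)   # 12 patch tokens of length 8
betas = np.linspace(1e-4, 0.02, 100)      # assumed linear noise schedule
xt, eps = forward_diffuse(patches, t=50, betas=betas, rng=rng)
print(patches.shape, xt.shape)
```

In the full framework, the clean patches would feed a causal Transformer encoder left to right, while a denoising head learns to recover each noised patch, and the two objectives are combined autoregressively.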
Lay Summary: TimeDART: Helping Computers Better Understand Time-Series Data

Many crucial types of data, like stock prices, weather patterns, or medical readings, unfold over time. Understanding these time-series data deeply is key to predicting future trends or identifying anomalies. However, current computer methods often struggle to simultaneously capture both the long-term overall changes and the subtle local details within these complex datasets. To address this challenge, we developed TimeDART, a new machine learning framework. TimeDART cleverly unifies two powerful generative techniques. It first models evolving trends like reading a story from left to right, capturing the global flow. Then, through a "denoising" process, it meticulously identifies fine-grained local patterns that might otherwise be overlooked. This dual approach allows TimeDART to learn more comprehensive and transferable representations of time series. Our extensive experiments show that TimeDART consistently outperforms previous methods on various tasks, including forecasting future values and classifying different types of time series. This means TimeDART can help computers analyze time-based data more accurately, leading to more reliable applications in fields like finance, healthcare, and environmental monitoring.
Link To Code: https://github.com/Melmaphother/TimeDART
Primary Area: General Machine Learning->Sequential, Network, and Time Series Modeling
Keywords: time series, self-supervised learning, diffusion model
Submission Number: 2826