Sundial: A Family of Highly Capable Time Series Foundation Models

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 Oral · License: CC BY-NC-ND 4.0
TL;DR: We introduce Sundial, a family of native, flexible, and scalable time series foundation models pre-trained on a trillion time points.
Abstract: We introduce Sundial, a family of native, flexible, and scalable time series foundation models. To predict the distribution of the next patch, we propose TimeFlow Loss, an objective based on flow-matching that enables native pre-training of Transformers on continuous-valued time series without discrete tokenization. Conditioned on arbitrary-length time series, our models are pre-trained without specifying any prior distribution and can generate multiple probable predictions, achieving greater flexibility in representation learning than parametric densities. Towards time series foundation models, we apply minimal but crucial adaptations to the Transformer and curate TimeBench, a corpus of one trillion time points comprising mostly real-world datasets along with synthetic data. By mitigating mode collapse via TimeFlow Loss, we pre-train a family of Sundial models on TimeBench that achieve unprecedented model capacity and generalization performance. Beyond excellent scalability, Sundial achieves state-of-the-art results on both point and probabilistic forecasting benchmarks with just-in-time inference speed, i.e., zero-shot predictions within a few milliseconds. We believe Sundial's pioneering generative forecasting capability can improve model reliability in real-world decision-making. Code is available at: https://github.com/thuml/Sundial.
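For readers curious about the mechanics, below is a minimal sketch of the kind of flow-matching objective that TimeFlow Loss builds on, written in PyTorch. All names here (`FlowMatchingHead`, `flow_matching_loss`, `sample_patch`, `cond_dim`, `patch_len`) are illustrative assumptions, not the authors' implementation; see the repository above for the actual code.

```python
# A hypothetical sketch of a flow-matching objective for next-patch
# prediction, conditioned on a Transformer hidden state. Not the paper's
# TimeFlow Loss implementation; a generic linear-interpolant variant.
import torch
import torch.nn as nn


class FlowMatchingHead(nn.Module):
    """Predicts a velocity field for the next patch, given the flow time t
    and a conditioning vector (e.g., the Transformer's last hidden state)."""

    def __init__(self, cond_dim: int, patch_len: int, hidden: int = 256):
        super().__init__()
        self.patch_len = patch_len
        self.net = nn.Sequential(
            nn.Linear(cond_dim + patch_len + 1, hidden),
            nn.SiLU(),
            nn.Linear(hidden, patch_len),
        )

    def forward(self, x_t, t, cond):
        # x_t: (B, patch_len) noisy patch; t: (B, 1) flow time; cond: (B, cond_dim)
        return self.net(torch.cat([x_t, t, cond], dim=-1))


def flow_matching_loss(head, target_patch, cond):
    """Regress the constant velocity (target - noise) at a random point on
    the straight path from Gaussian noise to the ground-truth patch."""
    noise = torch.randn_like(target_patch)                         # x_0 ~ N(0, I)
    t = torch.rand(target_patch.size(0), 1, device=target_patch.device)
    x_t = (1.0 - t) * noise + t * target_patch                     # point on the path
    v_target = target_patch - noise                                # true velocity
    return nn.functional.mse_loss(head(x_t, t, cond), v_target)


@torch.no_grad()
def sample_patch(head, cond, steps: int = 8):
    """Generate one probable next patch by integrating dx/dt = v(x, t, cond)
    from t=0 to t=1 with Euler steps, starting from fresh Gaussian noise."""
    x = torch.randn(cond.size(0), head.patch_len, device=cond.device)
    for i in range(steps):
        t = torch.full((cond.size(0), 1), i / steps, device=cond.device)
        x = x + head(x, t, cond) / steps
    return x
```

Calling `sample_patch` repeatedly with fresh noise draws yields the multiple probable predictions described above; because the head regresses a velocity field rather than the parameters of a fixed density, no prior distribution (e.g., Gaussian) needs to be assumed.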
Lay Summary: Predicting the future from past observations (time series forecasting) is crucial for many real-world applications. However, traditional methods often struggle because the variations of time series are complex and future values can be nondeterministic. While powerful AI models exist, they typically require massive amounts of task-specific training data and laborious tuning. To overcome these limitations, we developed a new family of powerful AI tools called time series foundation models. These models are based on generative AI, the same technology behind realistic image and text generation. Crucially, they were pre-trained on an enormous dataset: one trillion real-world time points from diverse domains. This extensive pre-training allows them to deeply understand complex patterns in time series. Our model significantly outperforms previous methods. It generates predictions that are not only more accurate but also capture a wider range of possible future outcomes, providing better insight into uncertainty. This versatility allows it to excel as both a point forecaster and a probabilistic forecaster. The best part? We released this model as a ready-to-use forecaster: it requires no further training and delivers state-of-the-art results on well-established forecasting benchmarks, making reliable predictions within milliseconds.
Link To Code: https://github.com/thuml/Sundial
Primary Area: Deep Learning->Foundation Models
Keywords: Time Series, Foundation Models
Submission Number: 2877