Abstract: With the growing availability of multi-domain time series data, there is an increasing demand for general forecasting models pre-trained on multi-source datasets to support diverse downstream prediction scenarios. Existing time series foundation models primarily focus on scaling up pre-training datasets and model sizes to enhance generalization performance. In this paper, we take a different approach by addressing two critical aspects of general forecasting models: (1) how to derive unified representations from heterogeneous multi-domain time series data, and (2) how to effectively capture domain-specific features to enable adaptive transfer across various downstream scenarios. To address the first aspect, we propose Decomposed Frequency Learning as the pre-training task, which leverages frequency-based masking and reconstruction to decompose coupled semantic information in time series, resulting in unified representations across domains. For the second aspect, we introduce the Time Series Register, which captures domain-specific representations during pre-training and enhances adaptive transferability to downstream tasks. Our model achieves state-of-the-art forecasting performance on seven real-world benchmarks, demonstrating remarkable few-shot and zero-shot capabilities.
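To make the idea of frequency-based masking and reconstruction concrete, the following is a minimal sketch of how such a pre-training input could be constructed. The function name, the random bin selection, and the `keep_ratio` parameter are illustrative assumptions rather than the paper's exact procedure; the point is only that masking in the frequency domain decomposes the series before the model is asked to reconstruct the original.

```python
import torch

def frequency_mask(x: torch.Tensor, keep_ratio: float = 0.5) -> torch.Tensor:
    """Illustrative frequency-based masking (a sketch, not ROSE's exact procedure).

    x: (batch, length) real-valued time series.
    Returns a series rebuilt from a random subset of frequency components;
    a model would then be trained to reconstruct the original x from it.
    """
    spec = torch.fft.rfft(x, dim=-1)                       # complex spectrum
    n_freq = spec.shape[-1]
    # Randomly keep a fraction of frequency bins and zero out the rest.
    mask = torch.rand(x.shape[0], n_freq, device=x.device) < keep_ratio
    masked_spec = spec * mask.to(spec.dtype)
    return torch.fft.irfft(masked_spec, n=x.shape[-1], dim=-1)

# Usage: the pre-training objective would reconstruct x from its masked version.
x = torch.randn(32, 512)        # batch of 32 series, length 512
x_masked = frequency_mask(x)    # frequency-masked input to the model
```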
Lay Summary: Time series forecasting plays a crucial role in various domains. However, building a specific model for each new task is resource-intensive.
In this paper, we developed ROSE, a lightweight model that is pre-trained on multi-domain data and fine-tuned with minimal downstream data for fast application. ROSE designs its pre-training task from a frequency-domain perspective, which helps it learn generalized representations.
In addition, it accounts for domain-specific information, enabling faster and better transfer to downstream tasks.
ROSE excels in scenarios with scarce data. Its efficiency and adaptability make it practical for real-world applications where data availability and computational resources are limited.
Primary Area: Deep Learning->Sequential Models, Time series
Keywords: Foundation models; Time series forecasting
Submission Number: 6072