Does Cross-Domain Pre-Training Truly Help Time-Series Foundation Models?

Published: 06 Mar 2025, Last Modified: 06 Mar 2025 · ICLR 2025 FM-Wild Workshop · CC BY 4.0
Keywords: Foundation models, Time series, Domain transfer, Pre-training
TL;DR: We argue that cross-domain pre-training of Time-Series Foundation Models, inspired by strategies used in LLMs, often yields counterintuitive transfer effects and warrants closer scrutiny.
Abstract: Inspired by the success of pre-training large language models, recent efforts have explored cross-domain pre-training for time-series foundation models (TSFMs). However, the distinct data generation dynamics and contextual limitations of time-series data challenge the direct transferability of LLM strategies to TSFMs. In this paper, we investigate ***whether cross-domain pre-training truly benefits TSFMs***. Through systematic experiments, we reveal that while cross-domain pre-training can enhance performance in certain domains, it may also cause severe negative transfer in others due to domain disparities in sampling frequencies and evolution patterns. Surprisingly, transfer effects are often counterintuitive: unrelated domains can yield significant gains, whereas related domains may induce degradation. These findings highlight the need for tailored pre-training strategies that address the unique characteristics of time-series data. Our study provides actionable insights to guide the development of more effective TSFMs.
Submission Number: 70