Closing the Zero-Shot Gap in Time Series Classification with Synthetic Data and Test-Time Strategies

Published: 01 Mar 2026, Last Modified: 01 Mar 2026
Venue: ICLR 2026 TSALM Workshop Poster
License: CC BY 4.0
Keywords: Time series classification, foundation models, test-time optimization
Abstract: Developing foundation models for time series classification is of high practical relevance, as such models can serve as universal feature extractors for diverse downstream tasks. Despite the promise of this approach, a substantial performance gap remains between frozen and fine-tuned encoders. In this work, we introduce a methodology that significantly strengthens zero-shot feature extraction for time series. Using the Mantis architecture as a backbone, we pre-train it entirely on synthetic time series, monitoring its performance layer by layer and epoch by epoch. We find that the model benefits from data scaling, but that this improvement is hidden in one of the intermediate transformer layers. In addition, we show that performance can be further improved by refining output-token aggregation, self-ensembling, and cross-model embedding fusion. Our experiments on the UCR benchmark show that the improved version of Mantis achieves state-of-the-art zero-shot performance, competitive with its fine-tuned version.
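The cross-model embedding fusion mentioned in the abstract can be illustrated with a minimal sketch. The code below is purely illustrative and not taken from the paper's implementation: it assumes two frozen encoders have already produced per-series embeddings (here simulated with random arrays), L2-normalizes each embedding so that neither encoder dominates by scale, and concatenates them into a single fused feature vector for a downstream classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical frozen embeddings for 8 time series from two encoders
# (e.g. an intermediate Mantis layer and a second backbone).
# Names and dimensions are illustrative assumptions.
emb_model_a = rng.normal(size=(8, 16))  # encoder A, embedding dim 16
emb_model_b = rng.normal(size=(8, 32))  # encoder B, embedding dim 32

def fuse_embeddings(*embeddings):
    """Cross-model fusion sketch: L2-normalize each embedding
    per sample, then concatenate along the feature axis."""
    normed = [e / np.linalg.norm(e, axis=1, keepdims=True) for e in embeddings]
    return np.concatenate(normed, axis=1)

fused = fuse_embeddings(emb_model_a, emb_model_b)
print(fused.shape)  # fused feature matrix: (8, 48)
```

A linear probe (e.g. logistic regression) would then be trained on `fused` while both encoders stay frozen, which is the zero-shot feature-extraction setting the paper evaluates.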
Track: Research Track (max 4 pages)
Submission Number: 83