Towards Efficient Foundation Model: A Novel Time Series Embedding

Published: 28 Nov 2025 · Last Modified: 30 Nov 2025 · NeurIPS 2025 Workshop MLxOR · CC BY 4.0
Keywords: Time Series Foundation Models, Time Series Embeddings, Time Series Classification
TL;DR: We introduce a universal image-based time series embedding that rivals TimesFM on synthetic benchmarks while being far more resource-efficient, offering a lightweight alternative encoder for foundation models.
Abstract: A Time Series Foundation Model (TSFM) learns appropriate embeddings from pre-training data and uses them to embed an input time series for in-context learning to produce forecasts. TSFMs require rich pre-training datasets and large computational resources to learn effective embeddings. In contrast, the traditional time series modeling paradigm generates a forecast for a given time series by fitting several pre-determined models and using the best of them. Though resource-efficient, it suffers from an inability to utilize pre-training data, along with the challenges involved in selecting the best model. In this work, we aim to bring the best of both worlds together to enable a resource-efficient TSFM approach. To that end, we introduce a novel embedding of time series of any length and scale by mapping them to the unit square (i.e., $[0, 1]^2$), or equivalently a 2D image. To evaluate its efficacy against embeddings from a TSFM, we consider the task of model identification, i.e., classification on datasets where each time series is generated from one of several pre-determined model classes. We find that the performance of the proposed embeddings is comparable to that of embeddings from a pre-trained TSFM, but at a fraction of the resource requirement. This suggests an alternative architectural possibility for a compute-efficient TSFM.
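The abstract does not spell out the construction, but one plausible reading of "mapping a series of any length and scale to $[0, 1]^2$" is min-max normalization of both the time index and the values, followed by rasterization onto a fixed-size grid. The sketch below illustrates that idea only; the function name `unit_square_embedding`, the `resolution` parameter, and the min-max/rasterization choices are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

def unit_square_embedding(series, resolution=64):
    """Hypothetical sketch: map a 1D time series of any length and scale
    onto the unit square [0, 1]^2, rasterized as a fixed-size 2D image.
    Min-max scaling and grid rasterization are assumptions, not the
    paper's stated construction."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    # Normalize the time index to [0, 1] (handles any series length).
    t = np.linspace(0.0, 1.0, n)
    # Min-max normalize the values to [0, 1] (handles any scale).
    lo, hi = x.min(), x.max()
    v = (x - lo) / (hi - lo) if hi > lo else np.full(n, 0.5)
    # Rasterize the (t, v) points onto a resolution x resolution grid.
    img = np.zeros((resolution, resolution))
    cols = np.minimum((t * resolution).astype(int), resolution - 1)
    rows = np.minimum((v * resolution).astype(int), resolution - 1)
    img[rows, cols] = 1.0
    return img

# Series of very different lengths and scales map to same-sized images,
# which is what would let a single downstream classifier consume them.
img_a = unit_square_embedding(np.sin(np.linspace(0, 10, 500)))
img_b = unit_square_embedding(1e6 * np.random.randn(37).cumsum())
assert img_a.shape == img_b.shape == (64, 64)
```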
Submission Number: 108