TimEE: Towards End-to-end Time Series Classification via In-Context Learning

Published: 01 Mar 2026, Last Modified: 10 Apr 2026 · ICLR 2026 TSALM Workshop Poster · CC BY 4.0
Presentation Attendance: No, we cannot present in person
Keywords: Time Series Classification, Foundation Models, In-Context Learning
TL;DR: TimEE is a foundation model capable of in-context learning on time series classification tasks.
Abstract: We introduce TimEE, a 2M-parameter foundation model for end-to-end time series classification via in-context learning. Unlike prior work, which relies on decoupled feature encoders and task-specific classifiers, TimEE uses a unified framework to directly approximate the conditional predictive distribution of a test sample given the training set. Concretely, it performs both temporal reasoning and classification within a single forward pass. Evaluated on 42 binary classification datasets from the UCR Time Series Archive, TimEE outperforms default linear-probing baselines and matches the performance of models up to 60$\times$ larger, while reducing runtime by up to an order of magnitude. Our results suggest that end-to-end trained foundation models are an effective and computationally efficient alternative to the two-stage paradigm for time series classification.
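The abstract describes an interface in which a single forward pass maps a labeled training set plus a query series directly to a predictive distribution, with no separately fitted encoder or classifier. TimEE's architecture is not detailed here, so the sketch below is only a toy stand-in (a distance-based softmax over support samples, not the actual model) that illustrates what approximating p(y | x_test, D_train) in one call looks like; the function name and temperature parameter are illustrative assumptions.

```python
import numpy as np

def in_context_predict(train_x, train_y, test_x, temperature=1.0):
    """Toy stand-in for an in-context classifier.

    One call maps (training set, query series) -> predictive distribution
    p(y | x_test, D_train). This is NOT TimEE; it is a distance-softmax
    illustration of the single-pass interface described in the abstract.
    """
    train_x = np.asarray(train_x, dtype=float)  # (n_support, series_len)
    test_x = np.asarray(test_x, dtype=float)    # (series_len,)
    train_y = np.asarray(train_y)
    classes = np.unique(train_y)
    # Similarity of the query to each labeled support series.
    dists = np.linalg.norm(train_x - test_x, axis=1)
    weights = np.exp(-dists / temperature)
    # Aggregate support weights per class and normalize to a distribution.
    probs = np.array([weights[train_y == c].sum() for c in classes])
    probs /= probs.sum()
    return classes, probs

# Binary example: the query resembles the class-1 support series.
classes, probs = in_context_predict(
    train_x=[[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]],
    train_y=[0, 1],
    test_x=[0.9, 1.0, 1.1],
)
```

No training loop runs between "context" and "prediction"; the labeled set is consumed at inference time, which is the property that distinguishes this interface from the two-stage encoder-plus-classifier paradigm the abstract contrasts against.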
Track: Research Track (max 4 pages)
Submission Number: 111