Keywords: Time-Series Forecasting, Meta-Learning, Zero-Shot Forecasting, Prior-Fitted Networks, Context-Informed
TL;DR: Introduces a meta-learning framework for zero-shot time-series forecasting using related series as contextual conditioning.
Abstract: Accurate multivariate time-series forecasting is essential for decision making in domains such as retail, yet most neural forecasting models remain task-specific and perform poorly in data-scarce or zero-shot settings. Prior-Fitted Networks (PFNs) provide a principled Bayesian meta-learning framework, but existing PFN-based approaches to time-series forecasting make limited use of context, restricting their ability to condition on structurally related examples. We propose a \textit{context-centric framework} for zero-shot multivariate forecasting and introduce In-Context TimePFN, a Transformer-based PFN trained exclusively on synthetic tasks with explicit \textit{context–query} structure. By treating forecasting as in-context Bayesian inference over structured temporal contexts, our approach performs probabilistic prediction without task-specific fine-tuning. Experiments on four real-world benchmarks show that In-Context TimePFN achieves competitive or superior zero-shot performance compared to standard baselines and TimePFN. Controlled ablations further demonstrate that performance is highly sensitive to the quality and alignment of context, identifying structured context as a key enabler of effective zero-shot time-series forecasting with PFNs.
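The context–query setup described in the abstract can be illustrated with a minimal sketch. Everything below is an assumption for illustration only, not the authors' actual model or API: the synthetic task generator mimics the stated training regime (structurally related series sharing latent structure), and a trivial context-averaging forecaster stands in for the Transformer-based PFN to show the zero-shot interface (condition on context plus query history, emit a probabilistic forecast with no fine-tuning).

```python
import numpy as np

def make_synthetic_task(rng, n_context=4, history=48, horizon=12):
    """Hypothetical synthetic task with explicit context-query structure:
    several series share a latent trend + seasonality (the 'structural
    relation') and differ only by independent noise."""
    t = np.arange(history + horizon)
    base = rng.uniform(-0.05, 0.05) * t + np.sin(2 * np.pi * t / 12)
    context = base + rng.normal(0, 0.2, size=(n_context, t.size))
    query = base + rng.normal(0, 0.2, size=t.size)
    return context, query[:history], query[history:]

class MeanContextForecaster:
    """Trivial stand-in for In-Context TimePFN (illustrative only):
    forecasts the query's future as the per-timestep mean of the
    aligned context series, with the context spread as a crude
    uncertainty estimate. No training or fine-tuning occurs."""
    def predict(self, context, query_history):
        h = query_history.size
        return context[:, h:].mean(axis=0), context[:, h:].std(axis=0)

rng = np.random.default_rng(0)
context, hist, future = make_synthetic_task(rng)
mean, std = MeanContextForecaster().predict(context, hist)
mae = float(np.abs(mean - future).mean())
```

The sketch also hints at the ablation finding: if `context` were misaligned or unrelated to the query, the context-conditioned forecast would degrade, which is the sensitivity to context quality the abstract reports.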
Track: Research Track (max 4 pages)
Submission Number: 68