Meta-Learning Contextual Time Series Forecasting with Neural Processes

ICLR 2026 Conference Submission 18560 Authors

19 Sept 2025 (modified: 08 Oct 2025) · License: CC BY 4.0
Keywords: Meta Learning, Neural Processes, Latent Variable Models, Time Series Forecasting
TL;DR: We propose a novel meta-learning Neural Process architecture that integrates context from multiple related time series to perform robust time series forecasting.
Abstract: Neural Processes (NPs) are a powerful class of meta-learning models that can be applied to time series forecasting by formalizing it as a probabilistic regression problem. However, conventional NPs base their predictions only on observations from a single time series, which limits their ability to leverage varied contextual information. In this paper, we introduce a novel NP architecture that, in the spirit of meta-learning, is designed to incorporate context information from multiple related time series. To this end, our approach treats related time series as conditionally independent context examples of a shared underlying data-generating process corresponding to a specific meta-task. A sequence encoder aggregates a variable number of such context time series into a latent task description, which then conditions a sequence decoder, enabling accurate forecasting of unseen target time series. We evaluate our approach on challenging time series forecasting problems, demonstrating that our architecture performs favorably compared to a range of competing approaches.
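To make the described conditioning concrete, the following is a minimal PyTorch sketch of the general idea: a sequence encoder embeds each context series, a permutation-invariant mean pooling aggregates the embeddings into a latent task description, and that latent conditions an autoregressive decoder over the target series. All module names, the GRU choice, and hyperparameters are illustrative assumptions, not the paper's actual architecture.

```python
# Hypothetical sketch of NP-style forecasting conditioned on multiple
# related context time series; not the authors' implementation.
import torch
import torch.nn as nn


class ContextualNPSketch(nn.Module):
    def __init__(self, d_in=1, d_hid=64, d_z=32):
        super().__init__()
        self.series_enc = nn.GRU(d_in, d_hid, batch_first=True)  # per-series encoder
        self.to_z = nn.Linear(d_hid, d_z)                        # latent task description
        self.decoder = nn.GRU(d_in + d_z, d_hid, batch_first=True)
        self.head = nn.Linear(d_hid, 2)                          # predictive mean, log-variance

    def forward(self, context, target_hist, horizon):
        # context: (B, M, T_c, d_in) -- M related context series per meta-task
        # target_hist: (B, T_h, d_in) -- observed prefix of the target series
        B, M, T_c, d = context.shape
        _, h = self.series_enc(context.reshape(B * M, T_c, d))
        emb = h[-1].reshape(B, M, -1)
        # Mean pooling over context series: invariant to their order and
        # number, matching the conditional-independence assumption.
        z = self.to_z(emb.mean(dim=1))
        # Warm up the decoder on the target history, conditioned on z at each step.
        z_rep = z.unsqueeze(1).expand(-1, target_hist.size(1), -1)
        _, h_dec = self.decoder(torch.cat([target_hist, z_rep], dim=-1))
        # Autoregressive rollout over the forecast horizon.
        preds = []
        x = target_hist[:, -1:, :]
        for _ in range(horizon):
            out, h_dec = self.decoder(torch.cat([x, z.unsqueeze(1)], dim=-1), h_dec)
            mu, logvar = self.head(out[:, -1]).chunk(2, dim=-1)
            preds.append((mu, logvar))
            x = mu.unsqueeze(1)  # feed the predicted mean back in
        return preds


if __name__ == "__main__":
    model = ContextualNPSketch()
    ctx = torch.randn(4, 5, 48, 1)   # 4 tasks, 5 context series of length 48
    hist = torch.randn(4, 24, 1)     # target history of length 24
    preds = model(ctx, hist, horizon=12)
    print(len(preds), preds[0][0].shape)  # 12 steps, (4, 1) mean per step
```

The mean-pooled latent is the simplest aggregation consistent with treating context series as exchangeable examples of one data-generating process; attention-based pooling would be a natural variant.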
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 18560