Keywords: time series forecasting, retrieval augmented generation, time series foundation model
Abstract: Time series forecasting plays a crucial role in data mining, driving rapid advancements across numerous industries.
With the emergence of large models, time series foundation models (TSFMs) have exhibited remarkable generalization capabilities, such as zero-shot learning, through large-scale pre-training.
Meanwhile, Retrieval-Augmented Generation (RAG) methods are widely employed to enhance the performance of foundation models on unseen data by allowing models to access external knowledge.
In this paper, we introduce **TimeRAF**, a **R**etrieval-**A**ugmented **F**orecasting model that enhances zero-shot time series forecasting through retrieval-augmented techniques.
We develop customized time series knowledge bases tailored to specific forecasting tasks.
TimeRAF employs an end-to-end learnable retriever to extract valuable information from the knowledge base.
Additionally, we propose Channel Prompting for knowledge integration, which effectively extracts relevant information from the retrieved knowledge along the channel dimension.
Extensive experiments demonstrate the effectiveness of our model, showing significant improvements across various domains and datasets.
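To make the abstract's pipeline concrete, the sketch below illustrates one generic retrieval-augmented forecasting step under stated assumptions: a learnable retriever scores knowledge-base series against the query, the top-k matches are retrieved, and they are concatenated with the query along the channel dimension before a forecasting backbone. This is a minimal illustration, not the authors' implementation; the module names (`Retriever`, `RAGForecaster`) and the simple linear encoder/backbone are hypothetical stand-ins for TimeRAF's actual components.

```python
# Minimal sketch (hypothetical, not the paper's code) of retrieval-augmented forecasting:
# score knowledge-base entries with a learnable retriever, take the top-k series,
# stack them with the query along the channel dimension, and forecast.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Retriever(nn.Module):
    """Embeds series and scores knowledge-base entries against a query."""
    def __init__(self, context_len: int, d_model: int = 64):
        super().__init__()
        self.encoder = nn.Linear(context_len, d_model)

    def forward(self, query, kb):
        # query: (B, L), kb: (N, L)
        q = F.normalize(self.encoder(query), dim=-1)   # (B, d)
        k = F.normalize(self.encoder(kb), dim=-1)      # (N, d)
        return q @ k.T                                 # similarity scores, (B, N)

class RAGForecaster(nn.Module):
    """Retrieves top-k series and fuses them with the query along channels."""
    def __init__(self, context_len: int, horizon: int, top_k: int = 3):
        super().__init__()
        self.top_k = top_k
        self.retriever = Retriever(context_len)
        # Backbone sees 1 query channel plus top_k retrieved channels.
        self.backbone = nn.Linear(context_len * (1 + top_k), horizon)

    def forward(self, query, kb):
        scores = self.retriever(query, kb)                  # (B, N)
        topk = scores.topk(self.top_k, dim=-1).indices      # (B, k)
        retrieved = kb[topk]                                # (B, k, L)
        x = torch.cat([query.unsqueeze(1), retrieved], 1)   # (B, 1+k, L), channel stacking
        return self.backbone(x.flatten(1))                  # (B, horizon)

# Toy usage with random data.
B, L, H, N = 8, 96, 24, 500
model = RAGForecaster(context_len=L, horizon=H)
forecast = model(torch.randn(B, L), torch.randn(N, L))
print(forecast.shape)  # torch.Size([8, 24])
```

Because the retriever's scores feed directly into the forecasting loss through the selected series, such a design can in principle be trained end to end; the paper's Channel Prompting presumably replaces the naive channel concatenation shown here with a learned integration along the channel dimension.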
Primary Area: learning on time series and dynamical systems
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 7472