A decoder-only foundation model for time-series forecasting

TMLR Paper 1785 Authors

03 Nov 2023 (modified: 16 Dec 2023) · Withdrawn by Authors
Abstract: Motivated by recent advances in large language models for Natural Language Processing (NLP), we design a time-series foundation model for forecasting whose out-of-the-box zero-shot performance on a variety of public datasets comes close to the accuracy of state-of-the-art supervised forecasting models trained on each individual dataset. Our model is based on pretraining a patched-decoder style attention model on a large time-series corpus, and it can work well across different forecasting history lengths, prediction lengths, and temporal granularities.
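The abstract describes a patched-decoder architecture: the input history is split into patches, each patch is embedded as a token, a causal (decoder-only) transformer runs over the tokens, and the model predicts an output patch for the forecast horizon. Below is a minimal illustrative sketch of such a model in PyTorch; the class name, hyperparameters, and patch/horizon lengths are assumptions for exposition and are not the authors' implementation (positional encodings and other details are omitted for brevity).

```python
import torch
import torch.nn as nn


class PatchedDecoderForecaster(nn.Module):
    """Sketch of a patched decoder-only forecaster: patch the input series,
    embed patches as tokens, apply a causally masked transformer, and map the
    last token's representation to an output patch of length horizon_len."""

    def __init__(self, patch_len=32, horizon_len=128, d_model=256,
                 n_heads=4, n_layers=4):
        super().__init__()
        self.patch_len = patch_len
        self.input_proj = nn.Linear(patch_len, d_model)   # patch -> token embedding
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.decoder = nn.TransformerEncoder(layer, n_layers)  # used with a causal mask
        self.output_proj = nn.Linear(d_model, horizon_len)     # token -> output patch

    def forward(self, context):
        # context: (batch, context_len), with context_len divisible by patch_len
        b, t = context.shape
        patches = context.view(b, t // self.patch_len, self.patch_len)
        tokens = self.input_proj(patches)
        # Causal mask so each token only attends to earlier patches.
        mask = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        hidden = self.decoder(tokens, mask=mask)
        # Forecast from the representation of the last (most recent) patch.
        return self.output_proj(hidden[:, -1, :])               # (batch, horizon_len)


# Example usage: forecast 128 future points from a history of 512 points.
model = PatchedDecoderForecaster()
history = torch.randn(8, 512)
forecast = model(history)  # shape (8, 128)
```

Because the output projection emits a whole patch of future values per step, longer horizons can be produced by feeding predictions back in autoregressively, which is one way such a model can accommodate varying history and prediction lengths.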
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Lei_Li11
Submission Number: 1785