Lag-Llama: Towards Foundation Models for Time Series Forecasting

Published: 01 Nov 2023 · Last Modified: 12 Dec 2023 · R0-FoMo Poster
Keywords: time series, foundational models, probabilistic forecasting, time series forecasting, transformers, decision making, pretrained models
TL;DR: Strong general-purpose univariate probabilistic time-series forecasting model, with power-law analysis to predict the model's scaling behavior
Abstract: Aiming to build foundation models for time-series forecasting and study their scaling behavior, we present here our work-in-progress on Lag-Llama, a general-purpose univariate probabilistic time-series forecasting model trained on a large collection of time-series data. The model shows good zero-shot prediction capabilities on unseen "out-of-distribution" time-series datasets, outperforming supervised baselines. We use smoothly broken power-laws to fit and predict model scaling behavior. The open source code is made available at https://github.com/kashif/pytorch-transformer-ts.
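The abstract's scaling analysis relies on fitting smoothly broken power laws to observed loss-vs-scale points and extrapolating them. As a minimal illustration of that idea (not the paper's actual parameterization or data), the sketch below fits a single-break smoothly broken power law to synthetic points with `scipy.optimize.curve_fit`; the function form, parameter names, and data are all assumptions for demonstration.

```python
import numpy as np
from scipy.optimize import curve_fit

def broken_power_law(x, a, b, c0, c1, d, f):
    """Single-break smoothly broken power law (illustrative form).

    Decays like x^(-c0) before the break at x ~ d, and like
    x^(-(c0 + c1)) after it; f controls how sharp the break is.
    """
    return a + b * x ** (-c0) * (1.0 + (x / d) ** (1.0 / f)) ** (-c1 * f)

# Synthetic "loss vs. scale" points -- purely illustrative, not from the paper.
x = np.logspace(2, 6, 20)
true_params = (0.1, 5.0, 0.3, 0.2, 1e4, 0.5)
rng = np.random.default_rng(0)
y = broken_power_law(x, *true_params) * (1 + 0.01 * rng.standard_normal(x.size))

# Fit the curve to the observed points.
popt, _ = curve_fit(broken_power_law, x, y, p0=[0.1, 5.0, 0.3, 0.2, 1e4, 0.5],
                    maxfev=20000)

# Extrapolate beyond the largest observed scale, as in scaling-behavior prediction.
pred = broken_power_law(1e7, *popt)
```

The extrapolated value `pred` predicts the loss at a scale an order of magnitude beyond the fitted range; in practice one would validate such extrapolations against held-out larger-scale runs.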
Submission Number: 73