GENERATIVE TIME SERIES LEARNING WITH TIME-FREQUENCY FUSED ENERGY-BASED MODEL

21 Sept 2023 (modified: 11 Feb 2024) Submitted to ICLR 2024
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: time series forecasting; generative model
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Long-term time series forecasting has gained significant attention in recent academic research. However, existing models primarily focus on deterministic point forecasts, neglecting generative long-term probabilistic forecasting and pre-trained models that transfer across diverse time series analysis tasks, both of which are essential for uncertainty quantification and computational efficiency. In this paper, we propose a novel encoder-only generative model for long-term probabilistic forecasting and imputation. Our model is an energy-based model that employs a time-frequency block to construct an unnormalized probability density function over temporal paths. The time-frequency block consists of two key components: a residual dilated convolutional network that enlarges the receptive field over the raw time series, and a time- and frequency-feature extraction network that integrates both local and global patterns. Our design enables long-term forecasts to be generated in a single forward run via Langevin MCMC, which drastically improves the efficiency and accuracy of long-term forecasting. Moreover, our model naturally serves as a general framework for forecasting at varying prediction lengths and imputing missing data points with a single pre-trained model, saving both time and resources. Experiments demonstrate that our model achieves competitive results on both forecasting and imputation tasks across a diverse range of public datasets, highlighting its promise as a unified time series model capable of handling multiple tasks.
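To make the sampling idea in the abstract concrete, the sketch below shows a generic energy-based forecaster sampled with unadjusted Langevin dynamics. ToyEnergyNet, the 192/96 window lengths, and all hyperparameters are illustrative assumptions, not the authors' architecture; in the paper the time-frequency block would play the role of the toy energy network.

```python
# Minimal sketch, not the authors' implementation: an energy-based model scores an
# entire (history, forecast) path, and Langevin MCMC draws probabilistic forecasts
# from the resulting unnormalized density. ToyEnergyNet is a hypothetical stand-in
# for the paper's time-frequency block.
import torch
import torch.nn as nn


class ToyEnergyNet(nn.Module):
    """Assigns a scalar energy to a (history, forecast) pair; lower = more plausible."""

    def __init__(self, context_len: int, horizon: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(context_len + horizon, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, context: torch.Tensor, forecast: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([context, forecast], dim=-1)).squeeze(-1)


def langevin_sample(energy_fn, context, horizon, steps=100, step_size=1e-2):
    """Sample one forecast path per context row with unadjusted Langevin dynamics:
    x <- x - (step_size / 2) * grad_x E(x) + sqrt(step_size) * noise."""
    x = torch.randn(context.size(0), horizon, requires_grad=True)
    for _ in range(steps):
        energy = energy_fn(context, x).sum()
        (grad,) = torch.autograd.grad(energy, x)
        with torch.no_grad():
            x = x - 0.5 * step_size * grad + (step_size ** 0.5) * torch.randn_like(x)
        x.requires_grad_(True)
    return x.detach()


# Usage: draw a batch of 96-step forecast paths conditioned on 192-step histories.
model = ToyEnergyNet(context_len=192, horizon=96)
history = torch.randn(8, 192)
paths = langevin_sample(model, history, horizon=96)
print(paths.shape)  # torch.Size([8, 96])
```

Because the energy scores the full future path jointly, every horizon step is refined together at each Langevin iteration rather than being generated autoregressively, which is what allows the whole horizon to be produced in one run.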
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3526