Rethinking the Temporal Modeling for Time Series Forecasting with Hybrid Modeling

19 Sept 2023 (modified: 11 Feb 2024). Submitted to ICLR 2024.
Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Time Series Forecasting
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Time series forecasting is a critical task in various domains, including traffic, energy, and weather forecasting. Recent research has explored MLP, Transformer, and CNN architectures for time series modeling, delivering promising results. In this work, we take a step further by systematically studying the strengths and limitations of these methods and integrating their capabilities into a unified framework for time series forecasting with a hybrid modeling approach. We introduce UniTS, a simple yet scalable framework for temporal modeling that incorporates multiple feature learning techniques. Moreover, prior research employed different hyperparameter configurations across temporal modeling approaches, which can cause unfair performance comparisons. For instance, when predicting over the same forecasting horizon, prior approaches often use markedly different lookback window lengths. In our study, we address this issue by validating and standardizing the parameters that significantly impact performance, ensuring a more equitable comparison of models across diverse datasets. UniTS achieves state-of-the-art performance across various domains, and we conduct extensive experiments to further evaluate its capabilities. Our results are fully reproducible, and the source code for this work is available at https://anonymous.4open.science/r/UniTS-8DA8/README.md.
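To make the hybrid-modeling idea concrete, here is a minimal NumPy sketch of one way a block could combine the three feature learners the abstract names: a causal convolution (CNN-style local patterns), a pointwise MLP (channel mixing), and single-head self-attention (Transformer-style global dependencies). All function names, the additive fusion, and the shapes are illustrative assumptions, not the actual UniTS design.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_feature(x, kernel):
    # Causal 1-D convolution along the time axis (local temporal patterns).
    T, C = x.shape
    k = len(kernel)
    pad = np.concatenate([np.zeros((k - 1, C)), x])  # left-pad so output is causal
    return np.stack([(pad[t:t + k] * kernel[:, None]).sum(axis=0) for t in range(T)])

def mlp_feature(x, W1, W2):
    # Pointwise two-layer MLP applied independently at each time step.
    return np.maximum(x @ W1, 0) @ W2

def attention_feature(x):
    # Single-head self-attention over time steps (global dependencies).
    scores = x @ x.T / np.sqrt(x.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ x

# Hybrid block: additive fusion of the three feature maps
# (an illustrative choice; real designs may gate, concatenate, or stack).
T, C = 96, 8  # lookback window of 96 steps, 8 channels -- assumed sizes
x = rng.standard_normal((T, C))
kernel = rng.standard_normal(3)
W1 = rng.standard_normal((C, 2 * C))
W2 = rng.standard_normal((2 * C, C))

y = conv1d_feature(x, kernel) + mlp_feature(x, W1, W2) + attention_feature(x)
print(y.shape)  # shape is preserved: (96, 8)
```

Keeping every branch shape-preserving, as above, is what lets such blocks be stacked and compared under a single standardized lookback window, which is the fairness concern the abstract raises.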
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 1793