MMGPT4LF: Leveraging an optimized pre-trained GPT-2 model with multi-modal cross-attention for load forecasting

Mingyang Gao, Suyang Zhou, Wei Gu, Zhi Wu, Haiquan Liu, Aihua Zhou, Xinliang Wang

Published: 01 Aug 2025 · Last Modified: 25 Nov 2025 · Applied Energy · License: CC BY-SA 4.0
Highlights:
• A novel multi-modal fusion framework for load forecasting is proposed, built on a pre-trained GPT-2 model.
• A cross-attention mechanism aligns and fuses high-dimensional representations from textual descriptions and time series data (see the sketch after this list).
• Linear transformation layers are incorporated at both the input and output stages of the GPT-2 model.
• Extensive case studies on two open-source datasets, against nine state-of-the-art forecasting methods, show that the proposed method achieves the highest accuracy.
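To make the architecture described in the highlights concrete, here is a minimal sketch of what such a design could look like in PyTorch with Hugging Face's GPT2Model. The highlights do not include code, so every class name, dimension, and training choice below (e.g., `CrossAttentionFusion`, `ts_dim`, freezing the backbone) is an illustrative assumption, not the authors' implementation.

```python
import torch
import torch.nn as nn
from transformers import GPT2Model


class CrossAttentionFusion(nn.Module):
    """Hypothetical sketch: project time-series inputs to the GPT-2 hidden width,
    fuse them with text embeddings via cross-attention, run the fused sequence
    through a pre-trained GPT-2 backbone, and map the result to a forecast."""

    def __init__(self, ts_dim=7, d_model=768, n_heads=8, horizon=24):
        super().__init__()
        # Input-stage linear layer: align time-series features with GPT-2's width.
        self.ts_proj = nn.Linear(ts_dim, d_model)
        # Cross-attention: time-series tokens (queries) attend to text tokens.
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Pre-trained GPT-2 backbone (frozen here; the paper may fine-tune it).
        self.backbone = GPT2Model.from_pretrained("gpt2")
        for p in self.backbone.parameters():
            p.requires_grad = False
        # Output-stage linear layer: hidden state -> multi-step load forecast.
        self.out_proj = nn.Linear(d_model, horizon)

    def forward(self, ts, text_emb):
        # ts: (batch, seq_len, ts_dim); text_emb: (batch, text_len, d_model)
        q = self.ts_proj(ts)
        fused, _ = self.cross_attn(q, text_emb, text_emb)
        h = self.backbone(inputs_embeds=fused).last_hidden_state
        # Forecast from the final token's hidden state.
        return self.out_proj(h[:, -1, :])
```

In this reading, `text_emb` would come from a separate encoder over the textual descriptions (e.g., weather or event notes); whether the authors freeze GPT-2 entirely or fine-tune selected components, as some GPT-2-for-time-series work does, is not stated in the highlights.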