Keywords: Diffusion Models, Time Series Forecasting, Model Predictive Control, Uncertainty Quantification, Reinforcement Learning, Energy Markets
TL;DR: We integrate diffusion-based probabilistic forecasting into Model Predictive Control frameworks to enhance decision-making in uncertain and partially observable environments.
Abstract: We propose Diffusion-Informed Model Predictive Control (D-I MPC), a generic framework for uncertainty-aware prediction and decision-making in partially observable stochastic systems that integrates diffusion-based time series forecasting models into Model Predictive Control (MPC) algorithms. In our approach, a diffusion-based time series forecasting model probabilistically estimates the evolution of the system's stochastic components. These forecasts are then incorporated into MPC algorithms to estimate future trajectories and optimize action selection under future uncertainty. We evaluate the framework on the task of energy arbitrage, where a Battery Energy Storage System participates in the day-ahead electricity market of New York State. Experimental results indicate that our model-based approach with a diffusion-based forecaster significantly outperforms both implementations with classical forecasting methods and model-free reinforcement learning baselines.
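The overall loop described in the abstract, sampling probabilistic price forecasts and selecting the first action of the best plan, can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the `sample_price_scenarios` function stands in for the diffusion forecaster (here a placeholder random walk), and the terminal value, action set, and battery parameters are all simplified assumptions.

```python
import numpy as np

def sample_price_scenarios(history, horizon, n_samples, rng):
    """Placeholder for the diffusion forecaster: in the paper, a diffusion
    model would sample price trajectories; here we use a random walk."""
    steps = rng.normal(0.0, 2.0, size=(n_samples, horizon))
    return history[-1] + np.cumsum(steps, axis=1)

def mpc_action(history, soc, capacity=10.0, power=2.0,
               horizon=12, n_samples=64, rng=None):
    """Choose the first action (MWh charged, negative = discharge) by
    averaging a crude one-step-plus-liquidation value over sampled futures."""
    rng = rng or np.random.default_rng(0)
    scenarios = sample_price_scenarios(history, horizon, n_samples, rng)
    best_action, best_value = 0.0, -np.inf
    for a in (-power, 0.0, power):          # discharge / idle / charge
        new_soc = np.clip(soc + a, 0.0, capacity)
        if abs(new_soc - soc) < abs(a) - 1e-9:
            continue                        # action infeasible at this SoC
        # Immediate cash flow (-a * price: buying pays, selling earns) plus a
        # crude terminal value: liquidate stored energy at the mean future price.
        values = -a * scenarios[:, 0] + new_soc * scenarios[:, 1:].mean(axis=1)
        value = values.mean()
        if value > best_value:
            best_value, best_action = value, a
    return best_action
```

In a full D-I MPC implementation the placeholder forecaster would be replaced by samples drawn from the trained diffusion model, and the single-step lookahead by optimization over the whole horizon; the scenario-averaging structure stays the same.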
Submission Number: 80