Keywords: conformal prediction; uncertainty quantification; time-series forecasting
Abstract: Quantifying uncertainty in time-series forecasting is particularly demanding because sequential data exhibit temporal dependence and are prone to distributional changes. Conformal inference has emerged as a powerful uncertainty quantification approach for evaluating the reliability of predictive models through the construction of prediction sets. Recent advances have introduced online conformal methods that adaptively adjust prediction thresholds through feedback mechanisms. However, existing feedback mechanisms typically rely solely on miscoverage indicators (actual feedback) — whether the true label falls within the interval at each time step — while overlooking the empirical prediction threshold (estimated feedback) derived from the oracle conformal method. In this paper, we propose $\textit{Dynamic Dual-feedback Conformal Inference}$ (DDCI), which incorporates a dual-feedback mechanism consisting of $\textit{actual feedback}$ and $\textit{estimated feedback}$. The former drives the primary adjustment of the intervals based on true observations, while the latter dampens excessive expansions or contractions by leveraging empirical thresholds from conformal inference during updates. By balancing these two signals, DDCI achieves more stable and narrower prediction intervals in sequential settings while preserving coverage validity.
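To make the dual-feedback idea concrete, the following is a minimal illustrative sketch, not the paper's actual algorithm: it combines an ACI-style quantile-tracking update driven by the miscoverage indicator (actual feedback) with a damping term that pulls the threshold toward the empirical $(1-\alpha)$-quantile of past nonconformity scores (estimated feedback). The blending weight `beta`, the learning rate `gamma`, and the synthetic scores are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
T, alpha, gamma, beta = 500, 0.1, 0.05, 0.3

# Synthetic absolute-residual nonconformity scores for a 1-D forecaster.
scores = np.abs(rng.normal(0.0, 1.0, T))

q = 1.0            # current prediction threshold (interval half-width)
covered = []
for t in range(1, T):
    err = float(scores[t] > q)        # actual feedback: miscoverage indicator
    covered.append(1.0 - err)
    # Estimated feedback: empirical (1 - alpha)-quantile of past scores,
    # i.e. the threshold an oracle split-conformal method would use now.
    q_hat = np.quantile(scores[:t], 1 - alpha)
    # Quantile-tracking update driven by actual feedback:
    # expand the interval after a miss, shrink it slightly after a cover.
    q_track = q + gamma * (err - alpha)
    # Damp excessive expansions/contractions toward the empirical threshold
    # (the weight beta is a free design choice in this sketch).
    q = (1 - beta) * q_track + beta * q_hat

coverage = np.mean(covered)   # should sit near the 1 - alpha = 0.9 target
```

In this toy run, the actual-feedback term alone would let the threshold drift with each hit or miss, while the estimated-feedback term anchors it near the oracle split-conformal value, yielding empirical coverage close to the nominal 90% level with less threshold oscillation.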
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 18277