TL;DR: We propose a conformal prediction-based method for model uncertainty quantification in continual learning (CPCL) and theoretically prove its asymptotic coverage guarantee.
Abstract: Continual learning has attracted increasing research attention in recent years due to its promising experimental results in real-world applications. In this paper, we study calibration in continual learning, that is, reliably quantifying the uncertainty of model predictions. Conformal prediction (CP) provides a general framework for model calibration: it outputs prediction intervals or sets with a theoretical coverage guarantee as long as the samples are exchangeable. However, the tasks in continual learning are learned in sequence, which violates the exchangeability assumption. Moreover, the model learns the current task with limited or no access to data from previous tasks, which makes it difficult to construct a calibration set. To address these issues, we propose a CP-based method for model uncertainty quantification in continual learning (CPCL), which also reveals the connection between prediction-interval length and forgetting. We analyze the oracle prediction interval in continual learning and theoretically prove the asymptotic coverage guarantee of CPCL. Finally, extensive experiments on simulated and real data empirically verify the validity of our proposed method.
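For context, below is a minimal sketch of standard split conformal prediction, the generic CP framework referenced in the abstract, not the paper's CPCL method. Given a held-out calibration set that is exchangeable with the test point, the interval it returns satisfies P(Y ∈ C(X)) ≥ 1 - alpha marginally. The scikit-learn-style `.predict` API and all names here are illustrative assumptions.

```python
# Minimal sketch of split (inductive) conformal prediction for regression.
# This illustrates the generic CP framework only; it is NOT the paper's CPCL method.
import numpy as np

def split_conformal_interval(model, X_cal, y_cal, x_test, alpha=0.1):
    """Return a prediction interval with marginal coverage >= 1 - alpha,
    valid whenever calibration and test points are exchangeable."""
    # Nonconformity scores: absolute residuals on the held-out calibration set.
    scores = np.abs(y_cal - model.predict(X_cal))
    n = len(scores)
    # Finite-sample-corrected quantile level; clipped to 1.0 for small n.
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q_hat = np.quantile(scores, q_level)
    y_pred = model.predict(np.atleast_2d(x_test))[0]
    return y_pred - q_hat, y_pred + q_hat
```

Sequentially arriving tasks shift the data distribution, so calibration scores computed as above are no longer exchangeable with the test score; this is the gap the paper's CPCL method addresses.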
Lay Summary: Continual learning has attracted increasing research attention in recent years due to its promising experimental results in real-world applications. However, existing work has largely ignored the issue of calibration in continual learning.
In this paper, we study calibration in continual learning, that is, reliably quantifying the uncertainty of model predictions. We propose a conformal prediction-based method for model uncertainty quantification in continual learning (CPCL). We analyze the oracle prediction interval in continual learning and theoretically prove the asymptotic coverage guarantee of CPCL. Finally, extensive experiments on simulated and real data empirically verify the validity of our proposed method.
Our work also reveals the connection between prediction-interval length and forgetting.
Primary Area: Theory->Learning Theory
Keywords: Uncertainty, continual learning
Submission Number: 11041