Abstract: The current deep learning paradigm generally rests on two assumptions that do not hold in many real-world applications: (i) all data is jointly available for training (allowing IID training); and (ii) at inference time, only data belonging to the classes seen during training is encountered (closed-world assumption). In this paper, we study the more realistic scenario in which a model must learn from a non-stationary data stream and, in addition, must assess the certainty of its predictions for use in open-world settings. We therefore endow a continual learning method with the ability to quantify uncertainty, improving its reliability and robustness. To this end, Evidential Deep Learning is integrated into a continual learning framework to efficiently perform continual out-of-distribution (OOD) detection as the model accumulates knowledge. The new approach is validated on three public datasets and in several continual learning settings, clearly outperforming existing state-of-the-art methods.
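To give a concrete sense of how Evidential Deep Learning yields an uncertainty score usable for OOD detection, the following is a minimal sketch of the standard Dirichlet-based formulation (evidence e ≥ 0, concentration α = e + 1, uncertainty u = K/S with S = Σα). The function name and the use of ReLU-clipped logits as evidence are illustrative assumptions, not the paper's specific implementation.

```python
def evidential_prediction(logits):
    """Dirichlet-based EDL prediction sketch (illustrative, not the paper's exact model).

    logits : raw per-class network outputs (list of floats)
    returns (class probabilities, uncertainty in (0, 1])
    """
    # Non-negative evidence, here obtained by clipping logits at zero (an assumption)
    evidence = [max(z, 0.0) for z in logits]
    # Dirichlet concentration parameters: alpha_k = e_k + 1
    alpha = [e + 1.0 for e in evidence]
    S = sum(alpha)  # Dirichlet strength
    # Expected class probabilities under the Dirichlet
    probs = [a / S for a in alpha]
    # Total uncertainty mass: u = K / S; near 1 when evidence is scarce (OOD-like)
    uncertainty = len(alpha) / S
    return probs, uncertainty
```

For a confident in-distribution input the evidence is large and u is small; for an input yielding little evidence (e.g. all-zero logits), u approaches 1, which is what makes the score a natural continual OOD detector.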