Uncertainty modeling is crucial for developing robust and reliable models, since it enables decision-makers to assess the trustworthiness of predictions and make informed choices accordingly. A straightforward way to endow models with uncertainty estimation is to model a probabilistic distribution over the input representations and approximate it by variational inference. However, this method inevitably underestimates uncertainty, resulting in overconfident predictions even on data that contains inherent noise or ambiguity. To address this challenge, we introduce a novel approach called Class-Context-aware Phantom Uncertainty Modeling. Rather than inferring the distribution of the input data directly, which leads to underestimated uncertainty, we shift the focus to inferring the distribution of their respective phantoms, which are derived by leveraging class-contextual information. We mitigate uncertainty underestimation by demonstrating that the estimated uncertainty of the original input data is no less than that of its phantom. Experiments on robust learning tasks, including noisy label learning and cross-domain generalization, showcase our method's superior robustness and generalization capabilities.
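The abstract does not give implementation details, but the baseline it critiques (modeling a distribution over input representations and fitting it by variational inference) is a standard probabilistic-embedding setup. Below is a minimal, framework-free sketch of that baseline under common assumptions: a Gaussian embedding with mean and log-variance heads, the reparameterization trick for sampling, and a KL term to a standard normal prior. All names (`w_mu`, `w_logvar`, etc.) are hypothetical and not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def probabilistic_embedding(x, w_mu, w_logvar):
    """Map input features to a Gaussian over the embedding space
    via separate mean and log-variance linear heads."""
    mu = x @ w_mu
    logvar = x @ w_logvar
    return mu, logvar

def reparameterize(mu, logvar, rng):
    # z = mu + sigma * eps: the reparameterization trick, which keeps
    # sampling differentiable for variational inference
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

def kl_to_standard_normal(mu, logvar):
    # KL( N(mu, sigma^2 I) || N(0, I) ), summed over embedding dims;
    # this is the variational regularizer on the embedding distribution
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar, axis=-1)

# Toy example: a batch of 4 inputs with 8 features, 3-d embeddings
x = rng.standard_normal((4, 8))
w_mu = rng.standard_normal((8, 3)) * 0.1
w_logvar = rng.standard_normal((8, 3)) * 0.1

mu, logvar = probabilistic_embedding(x, w_mu, w_logvar)
z = reparameterize(mu, logvar, rng)
kl = kl_to_standard_normal(mu, logvar)
```

The learned variance `exp(logvar)` is what such models report as per-input uncertainty; the paper's observation is that training this objective directly on the inputs tends to drive these variances too small, and its phantom construction is designed so the original input's uncertainty is bounded below by the phantom's.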