A Novel Regression Loss for Non-Parametric Uncertainty Optimization

Published: 21 Dec 2020, Last Modified: 20 Oct 2024, AABI 2020
Keywords: regression, uncertainty quantification, uncertainty evaluation, dropout
TL;DR: We propose a novel regression loss for dropout networks that improves on the state of the art in uncertainty quantification.
Abstract: Quantification of uncertainty is one of the most promising approaches to establishing safe machine learning. Despite its importance, it is far from being generally solved, especially for neural networks. One of the most commonly used approaches so far is Monte Carlo dropout, which is computationally cheap and easy to apply in practice. However, it can underestimate the uncertainty. We propose a new objective, referred to as second-moment loss (SML), to address this issue. While the full network is encouraged to model the mean, the dropout networks are explicitly used to optimize the model variance. We extensively study the performance of the new objective on various UCI regression datasets. Compared to the state-of-the-art deep ensembles, SML leads to comparable prediction accuracies and uncertainty estimates while requiring only a single model. Under distribution shift, we observe moderate improvements. As a side result, we introduce an intuitive Wasserstein distance-based uncertainty measure that is non-saturating and thus makes it possible to resolve quality differences between any two uncertainty estimates.
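The abstract describes the second-moment loss only at a high level: the full network fits the mean, and stochastic dropout passes fit the variance. Below is a minimal PyTorch sketch of one plausible instantiation, purely for illustration; the names `MLP` and `second_moment_loss`, the choice of `n_samples=5`, and the exact form of the variance term are assumptions of this sketch, not the paper's verbatim objective.

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    """Small dropout MLP for 1-D regression (illustrative architecture)."""
    def __init__(self, d_in=1, hidden=64, p=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_in, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

def second_moment_loss(model, x, y, n_samples=5):
    # Mean term: the full network (dropout disabled via eval mode;
    # gradients still flow) regresses the target.
    model.eval()
    mu = model(x)
    mean_term = (mu - y).pow(2).mean()

    # Variance term (assumed form): each stochastic dropout pass should
    # deviate from the full-network mean by about as much as the label
    # does, so the dropout spread tracks the residual scale. The mean is
    # detached so this term only shapes the dropout behavior.
    model.train()  # re-enable dropout
    resid = (y - mu.detach()).abs()
    var_term = 0.0
    for _ in range(n_samples):
        y_s = model(x)  # stochastic forward pass
        var_term = var_term + ((y_s - mu.detach()).abs() - resid).pow(2).mean()
    return mean_term + var_term / n_samples
```

At inference the same split applies: the full (eval-mode) network provides the point prediction, and the empirical spread of several dropout passes provides the uncertainty estimate, so a single trained model suffices.

For the Wasserstein-based evaluation measure, the abstract gives only the idea: the score is non-saturating, so it can rank any two uncertainty estimates. As a heavily hedged illustration of the ingredient involved (the paper's exact measure may differ), SciPy's one-dimensional `wasserstein_distance` can compare a set of stochastic predictions against the observed label treated as a point mass:

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
y_obs = 0.3                               # observed label (toy value)
preds_a = rng.normal(0.3, 0.2, size=200)  # tight, well-placed estimate
preds_b = rng.normal(0.3, 1.5, size=200)  # overly wide estimate

# W1 against the point mass at the label grows with miscalibration
# instead of saturating, so the two estimates stay distinguishable.
print(wasserstein_distance(preds_a, [y_obs]))  # small
print(wasserstein_distance(preds_b, [y_obs]))  # larger
```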
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/a-novel-regression-loss-for-non-parametric/code)