Beyond Pinball Loss: Quantile Methods for Calibrated Uncertainty Quantification

21 May 2021, 20:43 (modified: 26 Oct 2021, 19:33) · NeurIPS 2021 Poster · Readers: Everyone
Keywords: uncertainty quantification, calibration, pinball loss, quantile regression
TL;DR: Despite the benefits of using quantiles for predictive uncertainty quantification, the pinball loss (the standard method in learning quantiles) has many pitfalls, and we propose novel methods to learn calibrated quantiles.
Abstract: Among the many ways of quantifying uncertainty in a regression setting, specifying the full quantile function is attractive, as quantiles are amenable to interpretation and evaluation. A model that predicts the true conditional quantiles for each input, at all quantile levels, presents a correct and efficient representation of the underlying uncertainty. To achieve this, many current quantile-based methods focus on optimizing the pinball loss. However, this loss restricts the scope of applicable regression models, limits the ability to target many desirable properties (e.g. calibration, sharpness, centered intervals), and may produce poor conditional quantiles. In this work, we develop new quantile methods that address these shortcomings. In particular, we propose methods that can apply to any class of regression model, select an explicit balance between calibration and sharpness, optimize for calibration of centered intervals, and produce more accurate conditional quantiles. We provide a thorough experimental evaluation of our methods, which includes a high dimensional uncertainty quantification task in nuclear fusion.
Supplementary Material: pdf
Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.