Evaluating High-Order Predictive Distributions in Deep Learning

Published: 20 May 2022, Last Modified: 20 Oct 2024. UAI 2022 Poster.
Keywords: uncertainty, neural networks, bayesian, testbed, joint distributions
TL;DR: Evaluating high-order predictive distributions is computationally challenging. We offer a practical heuristic with insights on synthetic and empirical data.
Abstract: Most research on supervised learning has focused on marginal predictions. In decision problems, joint predictive distributions are essential for good performance. Previous work has developed methods for assessing low-order predictive distributions with inputs sampled i.i.d. from the testing distribution. With low-dimensional inputs, these methods distinguish agents that effectively estimate uncertainty from those that do not. We establish that the predictive distribution order required for such differentiation increases greatly with input dimension, rendering these methods impractical. To accommodate high-dimensional inputs, we introduce \textit{dyadic sampling}, which focuses on predictive distributions associated with random \textit{pairs} of inputs. We demonstrate that this approach efficiently distinguishes agents in high-dimensional examples involving simple logistic regression as well as complex synthetic and empirical data.
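As a rough sketch of the dyadic-sampling evaluation described above: draw a random pair of anchor inputs, build an order-tau batch by resampling from that pair, and estimate the agent's joint log-loss by Monte Carlo over posterior samples. The callable `sample_probs`, the exact batch construction, and all hyperparameters below are illustrative assumptions, not the authors' implementation (see the supplementary material and linked code for that).

```python
import numpy as np

def dyadic_joint_nll(inputs, labels, sample_probs, tau=10, num_pairs=100,
                     num_posterior_samples=100, rng=None):
    """Estimate a joint negative log-likelihood under dyadic sampling.

    For each evaluation round, two random "anchor" inputs are drawn and a
    batch of `tau` inputs is formed by resampling from that pair. The agent's
    joint likelihood of the realized labels is estimated by Monte Carlo over
    posterior samples, averaging the product of per-input class probabilities.

    `sample_probs(x_batch)` is a hypothetical callable returning one posterior
    sample of class probabilities with shape [batch, num_classes].
    """
    rng = rng or np.random.default_rng(0)
    n = len(inputs)
    nlls = []
    for _ in range(num_pairs):
        pair = rng.choice(n, size=2, replace=False)       # random anchor pair
        idx = rng.choice(pair, size=tau, replace=True)    # dyadic batch of size tau
        x_batch, y_batch = inputs[idx], labels[idx]
        joint_liks = []
        for _ in range(num_posterior_samples):
            probs = sample_probs(x_batch)                 # [tau, num_classes]
            per_input = probs[np.arange(tau), y_batch]    # p(y_t | x_t, theta_m)
            joint_liks.append(np.prod(per_input))         # joint likelihood under theta_m
        # Monte Carlo estimate of the agent's joint predictive likelihood
        nlls.append(-np.log(np.mean(joint_liks) + 1e-12))
    return float(np.mean(nlls))
```

Under this sketch, an agent that models dependence between the two anchors (e.g. via an ensemble) would be scored differently from one that only produces well-calibrated marginals, which is the distinction the paper's evaluation is designed to surface.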
Supplementary Material: zip
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/evaluating-high-order-predictive/code)
