Predictive Uncertainty through Quantization

27 Sept 2018 (modified: 05 May 2023) · ICLR 2019 Conference Blind Submission
Abstract: High-risk domains require reliable confidence estimates from predictive models. Deep latent variable models provide these, but suffer from the rigid variational distributions used for tractable inference, which err on the side of overconfidence. We propose Stochastic Quantized Activation Distributions (SQUAD), which imposes a flexible yet tractable distribution over discretized latent variables. The proposed method is scalable, self-normalizing and sample efficient. We demonstrate that the model fully utilizes the flexible distribution, learns interesting non-linearities, and provides predictive uncertainty of competitive quality.
Keywords: variational inference, information bottleneck, Bayesian deep learning, latent variable models, amortized variational inference, uncertainty, learning non-linearities
TL;DR: A novel tractable and flexible variational distribution through quantization of latent variables, applied to the deep variational information bottleneck objective for improved uncertainty.
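A minimal sketch of the core idea, under assumptions the page itself doesn't confirm: each latent unit carries a categorical distribution over a fixed grid of quantization levels, sampled here with a straight-through Gumbel-softmax relaxation so gradients can flow, with a KL term to a uniform prior over levels serving as the rate penalty in an information-bottleneck objective. The class and parameter names (`QuantizedLatentLayer`, `n_levels`, `tau`) are illustrative, not from the paper, which may use a different parameterization or gradient estimator.

```python
# Illustrative SQUAD-style layer (all names hypothetical, not from the paper).
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class QuantizedLatentLayer(nn.Module):
    """Categorical distribution over K fixed quantization levels per latent unit."""

    def __init__(self, in_features, n_latents, n_levels=8):
        super().__init__()
        # One set of K logits per latent unit, computed from the input.
        self.logits = nn.Linear(in_features, n_latents * n_levels)
        # Fixed quantization grid, here evenly spaced in [-1, 1].
        self.register_buffer("levels", torch.linspace(-1.0, 1.0, n_levels))
        self.n_latents, self.n_levels = n_latents, n_levels

    def forward(self, x, tau=1.0):
        logits = self.logits(x).view(-1, self.n_latents, self.n_levels)
        # Relaxed one-hot sample over the levels (straight-through estimator).
        one_hot = F.gumbel_softmax(logits, tau=tau, hard=True)
        z = one_hot @ self.levels  # quantized activations, shape (batch, n_latents)
        # KL(q || uniform prior over levels): the rate term of a bottleneck objective.
        log_q = F.log_softmax(logits, dim=-1)
        kl = (log_q.exp() * (log_q + math.log(self.n_levels))).sum(dim=(-1, -2))
        return z, kl
```

Because each latent is a finite categorical, quantities such as the entropy and the KL above are exact sums rather than Monte Carlo estimates, which is one plausible reading of the abstract's "self-normalizing" and "sample efficient" claims; the estimator choice above remains an assumption.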