Nonparametric posterior normalizing flows

Published: 19 Jun 2023, Last Modified: 28 Jul 2023
Venue: 1st SPIGM @ ICML Poster
Keywords: normalizing flows, Bayesian, nonparametric learning
Abstract: Normalizing flows allow us to describe complex probability distributions, and can be used to perform flexible maximum likelihood density estimation (Dinh et al., 2014). Such maximum likelihood density estimation is likely to overfit, particularly if the number of observations is small. Traditional Bayesian approaches offer the prospect of capturing posterior uncertainty, but come at high computational cost and do not provide an intuitive way of incorporating prior information. A nonparametric learning approach (Lyddon et al., 2018) allows us to combine observed data with priors on the space of observations. We present a scalable approximate inference algorithm for nonparametric posterior normalizing flows, and show that the resulting distributions can yield improved generalization and uncertainty quantification.
Submission Number: 91