Efficient Inference Amortization in Graphical Models using Structured Continuous Conditional Normalizing Flows

16 Oct 2019 (modified: 05 May 2023) · AABI 2019
Keywords: amortization, Bayesian inference, normalizing flows, graphical models, deconvolution, probabilistic programming
TL;DR: We introduce a more efficient neural architecture for amortized inference, which combines continuous and conditional normalizing flows using a principled choice of sparsity structure.
Abstract: We introduce a more efficient neural architecture for amortized inference, which combines continuous and conditional normalizing flows using a principled choice of structure. Our gradient flow derives its sparsity pattern from the minimally faithful inverse of its underlying graphical model. We find that this factorization reduces both the number of parameters in the neural network and the number of adaptive integration steps required by the ODE solver. Consequently, throughput at both training and inference time increases, without decreasing performance compared to unconstrained flows. By expressing the structural inversion and the flow construction as compilation passes of a probabilistic programming language, we demonstrate their applicability to the stochastic inversion of realistic models such as convolutional neural networks (CNNs).
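To make the core idea concrete, the sketch below shows one way a sparsity pattern from an inverse graphical model could be imposed on the vector field of a continuous normalizing flow. This is an illustrative assumption, not the paper's implementation: the three-variable adjacency matrix, the `MaskedLinear`/`StructuredVectorField` names, and the naive fixed-step Euler integrator (standing in for the adaptive ODE solver the abstract refers to) are all hypothetical.

```python
# Minimal sketch: a sparsity-masked vector field for a continuous normalizing
# flow, with the mask taken from a hypothetical inverse-graph adjacency.
# Not the authors' implementation; a real CNF would use an adaptive ODE
# solver and also track the log-density change along the trajectory.
import torch
import torch.nn as nn

class MaskedLinear(nn.Linear):
    """Linear layer whose weights are zeroed wherever the adjacency is zero."""
    def __init__(self, adjacency: torch.Tensor):
        out_dim, in_dim = adjacency.shape
        super().__init__(in_dim, out_dim)
        self.register_buffer("mask", adjacency.float())

    def forward(self, x):
        return nn.functional.linear(x, self.weight * self.mask, self.bias)

class StructuredVectorField(nn.Module):
    """dz/dt = f(t, z); each output coordinate depends only on itself and
    its parents in the (assumed) minimally faithful inverse graph."""
    def __init__(self, adjacency: torch.Tensor):
        super().__init__()
        self.net = MaskedLinear(adjacency)

    def forward(self, t, z):
        return torch.tanh(self.net(z))

def euler_flow(field, z0, steps=20, t1=1.0):
    """Integrate the flow with fixed-step Euler (stand-in for an adaptive solver)."""
    z, dt = z0, t1 / steps
    for k in range(steps):
        z = z + dt * field(k * dt, z)
    return z

# Hypothetical 3-variable inverse graph: z0 depends on z1, z1 on z2.
adj = torch.tensor([[1, 1, 0],
                    [0, 1, 1],
                    [0, 0, 1]])
field = StructuredVectorField(adj)
z1 = euler_flow(field, torch.randn(8, 3))
```

Zeroing connections this way directly removes parameters; the abstract's further claim is that the resulting structured dynamics also need fewer adaptive integration steps, which is where the training- and inference-time throughput gains come from.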