Approximate Probabilistic Inference with Composed Flows

Published: 06 Jul 2022, Last Modified: 05 May 2023
Venue: NeurIPS 2020 Deep Inverse Workshop Poster
Keywords: probabilistic inference, normalizing flow, inverse problem
TL;DR: An algorithm for performing approximate probabilistic inference on the joint distribution given by a normalizing flow with applications to inverse problems.
Abstract: We study the problem of probabilistic inference on the joint distribution defined by a normalizing flow model. Given a pre-trained flow model $p(\boldsymbol{x})$, we wish to estimate $p(\boldsymbol{x}_2 \mid \boldsymbol{x}_1)$ for some partitioning of the variables $\boldsymbol{x} = (\boldsymbol{x}_1, \boldsymbol{x}_2)$. We first show that this task is computationally hard for a large class of flow models. Motivated by this, we propose a framework for \textit{approximate} probabilistic inference. Specifically, our method trains a new flow model with the property that its composition with the given model approximates the target conditional distribution. We describe how to train this new model using variational inference and how to handle conditioning under arbitrary differentiable transformations. Experimentally, our approach outperforms Langevin Dynamics in terms of sample quality, while requiring far fewer parameters and much less training time than regular variational inference. We further validate the flexibility of our method on a variety of inference tasks with applications to inverse problems.
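
As a sketch of the variational training described in the abstract, the reverse-KL objective for a flow-based approximate posterior can be written as follows; the symbols $q_\phi$, $f_\phi$, and $\boldsymbol{\epsilon}$ are illustrative notation not taken from the original text, and assume the new flow defines a tractable density $q_\phi(\boldsymbol{x}_2)$ whose samples are obtained by composing it with the pre-trained model:

$$
\mathcal{L}(\phi) \;=\; \mathrm{KL}\!\left(q_\phi(\boldsymbol{x}_2)\,\|\,p(\boldsymbol{x}_2 \mid \boldsymbol{x}_1)\right)
\;=\; \mathbb{E}_{\boldsymbol{x}_2 \sim q_\phi}\!\left[\log q_\phi(\boldsymbol{x}_2) - \log p(\boldsymbol{x}_1, \boldsymbol{x}_2)\right] \;+\; \log p(\boldsymbol{x}_1).
$$

Since $\log p(\boldsymbol{x}_1)$ does not depend on $\phi$, minimizing the expectation alone suffices; both $\log q_\phi$ and $\log p$ are tractable via the change-of-variables formula because both models are normalizing flows, and the expectation can be estimated with reparameterized samples $\boldsymbol{x}_2 = f_\phi(\boldsymbol{\epsilon})$, $\boldsymbol{\epsilon} \sim \mathcal{N}(\boldsymbol{0}, \boldsymbol{I})$.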