Variational Message Passing with Structured Inference Networks

Wu Lin, Mohammad Emtiyaz Khan, Nicolas Hubacher

Feb 15, 2018 (modified: Feb 15, 2018) ICLR 2018 Conference Blind Submission
  • Abstract: We propose a variational message-passing algorithm for a class of models that combine deep models with probabilistic graphical models. Our algorithm is a natural-gradient algorithm whose messages automatically reduce to stochastic gradients for the deep components of the model. Using an inference network with a special structure, our algorithm exploits the structural properties of the model to gain computational efficiency while retaining the simplicity and generality of deep-learning algorithms. By combining the strengths of two different types of inference procedures, our approach offers a framework that simultaneously enables structured, amortized, and natural-gradient inference for complex models. (A toy sketch of this update structure follows the keyword list below.)
  • TL;DR: We propose a message-passing algorithm for models that combine deep networks with probabilistic graphical models.
  • Keywords: Variational Inference, Variational Message Passing, Variational Auto-Encoder, Graphical Models, Structured Models
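
The abstract's central claim is that natural-gradient messages for the conjugate (graphical-model) part of the model coexist with ordinary stochastic gradients for the deep part. The sketch below is not the authors' code: the toy model (a standard-normal prior, a one-parameter tanh "network" likelihood), all variable names, and the step sizes are illustrative assumptions. It shows a conjugate-computation-style natural-gradient step, in which the posterior's natural parameters are moved toward the prior's natural parameters plus a Monte-Carlo gradient taken with respect to mean parameters, while the deep parameter is updated by plain stochastic gradient ascent.

```python
# Toy sketch (my construction, not the authors' code) of the update structure the
# abstract describes: the posterior over the latent variable is an exponential
# family tracked in natural parameters and updated by a natural-gradient step that
# adds the conjugate prior's natural parameters to a Monte-Carlo gradient, while
# the "deep" parameter is updated by ordinary stochastic gradient ascent.
#
# Assumed toy model:
#   prior       p(z)        = N(0, 1)
#   likelihood  p(x | z, w) = N(x | w * tanh(z), 1)   # tanh stands in for a deep net
#   posterior   q(z)        = N(m, v), stored as natural parameters (eta1, eta2)
import numpy as np

rng = np.random.default_rng(0)
x_obs = 1.5                           # one observed data point
w = 0.1                               # "network" weight, trained by SGD
eta1, eta2 = 0.0, -0.5                # q(z) starts at N(0, 1)
eta1_pri, eta2_pri = 0.0, -0.5        # prior N(0, 1) in natural-parameter form
rho, lr, n_mc = 0.1, 0.05, 32         # natural-gradient step, SGD step, MC samples

for _ in range(200):
    # Recover mean/variance of q(z) from its natural parameters.
    v = -0.5 / eta2
    m = eta1 * v

    # Reparameterized samples z = m + sqrt(v) * eps.
    eps = rng.standard_normal(n_mc)
    z = m + np.sqrt(v) * eps

    # d/dz log p(x | z, w) for each sample.
    resid = x_obs - w * np.tanh(z)
    dlog_dz = resid * w * (1.0 - np.tanh(z) ** 2)

    # Stochastic gradients of E_q[log p(x|z,w)] w.r.t. (m, v), then w.r.t. the
    # Gaussian mean parameters (mu1, mu2) = (m, m^2 + v).
    g_m = dlog_dz.mean()
    g_v = (eps / (2.0 * np.sqrt(v)) * dlog_dz).mean()
    g_mu1, g_mu2 = g_m - 2.0 * m * g_v, g_v

    # Natural-gradient message: blend the current natural parameters toward
    # (prior natural parameters + likelihood gradient in mean-parameter space).
    eta1 = (1 - rho) * eta1 + rho * (eta1_pri + g_mu1)
    eta2 = min((1 - rho) * eta2 + rho * (eta2_pri + g_mu2), -1e-3)  # keep v > 0

    # For the non-conjugate "deep" part the message is just a stochastic gradient.
    w += lr * (resid * np.tanh(z)).mean()

v = -0.5 / eta2
m = eta1 * v
print(f"q(z) ~= N({m:.3f}, {v:.3f}), learned w = {w:.3f}")
```

The sketch only covers a single latent variable; per the abstract, the actual method additionally makes the update structured and amortized by computing it through an inference network that mirrors the graphical-model structure.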
