Variational Message Passing with Structured Inference Networks


Nov 03, 2017 (modified: Nov 03, 2017) ICLR 2018 Conference Blind Submission
  • Abstract: Powerful models require powerful algorithms to be effective on real-world problems. We propose such an algorithm for a class of models that combine deep models with probabilistic graphical models. Our algorithm is a natural-gradient message-passing algorithm whose messages automatically reduce to stochastic gradients for the deep components of the model. Using a specially structured inference network, our algorithm exploits the structural properties of the model to gain computational efficiency while retaining the simplicity and generality of deep-learning algorithms. By combining the strengths of two different types of inference procedures, our approach offers a framework that simultaneously enables structured, amortized, and natural-gradient inference for complex models.
  • TL;DR: We propose a message-passing algorithm for models that contain both deep models and probabilistic graphical models.
  • Keywords: Variational Inference, Variational Message Passing, Variational Auto-Encoder, Graphical Models, Structured Models