Keywords: Probabilistic programming, Markov chain Monte Carlo, Bayesian inference
TL;DR: Hamiltonian Monte Carlo can be improved by automatically marginalizing variables via conjugacy
Abstract: Hamiltonian Monte Carlo (HMC) is a powerful algorithm for sampling latent variables from Bayesian models. The advent of probabilistic programming languages (PPLs) frees users from writing inference algorithms and lets them focus on modeling. However, many models are difficult for HMC to sample from directly and often require tricks such as model reparameterization. We propose automatic marginalization of variables, exploiting conjugacy in the graphical model extracted from a PPL, as part of HMC sampling; this substantially improves sampling from real-world hierarchical models.