Energy-Inspired Models: Learning with Sampler-Induced Distributions

John D Lawson, George Tucker, Bo Dai, Rajesh Ranganath

06 Sept 2019 (modified: 05 May 2023) · NeurIPS 2019
Abstract: Extending models with auxiliary latent variables is a powerful technique to increase model expressivity. Recent work has drawn a connection between multi-sample variational lower bounds and auxiliary variable variational inference. We expand on this connection and show that many of the recent developments in variational bounds can be viewed as specific choices in auxiliary variable variational inference. This view simplifies derivations and reveals implicit, suboptimal choices in existing lower bounds. Motivated by the success of enriching the variational family with auxiliary latent variables, we apply the same techniques to the generative model. This yields a tractable class of models which combine properties of normalized and energy-based models. We describe and evaluate two instantiations of such models, the first based on self-normalized importance sampling (SNIS) and the second based on Hamiltonian importance sampling (HIS). Both models outperform the recently proposed Learned Accept/Reject Sampling algorithm. Finally, the generative process for the SNIS model provides new insights on ranking Noise Contrastive Estimation and Contrastive Predictive Coding.
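The abstract describes a generative process based on self-normalized importance sampling (SNIS): draw several candidates from a tractable proposal, reweight them by an unnormalized (energy-like) term, and emit one candidate in proportion to its weight. The sketch below illustrates that generic SNIS procedure under stated assumptions; the function names (`snis_sample`, `proposal_sample`, `log_weight`) are illustrative placeholders, not the authors' actual code or API.

```python
import math
import random

def snis_sample(proposal_sample, log_weight, n=16, rng=None):
    """Generic SNIS sketch (assumed interface, not the paper's implementation).

    proposal_sample: callable taking an rng and returning one candidate.
    log_weight: callable returning an unnormalized log-weight for a candidate
                (e.g. a learned energy evaluated at that point).
    n: number of proposal draws per output sample.
    """
    rng = rng or random.Random()
    # Draw n candidates from the tractable proposal distribution.
    candidates = [proposal_sample(rng) for _ in range(n)]
    # Compute stabilized importance weights exp(log_weight - max).
    logw = [log_weight(c) for c in candidates]
    m = max(logw)
    weights = [math.exp(lw - m) for lw in logw]
    # Resample one candidate in proportion to its weight.
    return rng.choices(candidates, weights=weights, k=1)[0]
```

With `n = 1` this reduces to sampling from the proposal alone; larger `n` tilts the output toward the energy-reweighted distribution, which is the sense in which such models interpolate between normalized and energy-based models.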
CMT Num: 4594
Code Link: https://sites.google.com/view/energy-inspired-models