Keywords: continuous normalizing flows, normalizing flows, generative modeling, Boltzmann generators
TL;DR: In this work, we present a special class of continuous normalizing flows which admit efficient, exact-likelihood integrators for theoretically-sound importance sampling.
Abstract: Approximations in computing model likelihoods with continuous normalizing flows (CNFs) hinder the use of these models for importance sampling of Boltzmann distributions, where exact likelihoods are required. In this work, we present \textit{Verlet flows}, a class of CNFs on an augmented state-space inspired by symplectic integrators from Hamiltonian dynamics. When used with carefully constructed \textit{Taylor-Verlet integrators}, Verlet flows provide exact-likelihood generative models which generalize coupled flow architectures from the non-continuous setting while imposing minimal expressivity constraints. In experiments on toy densities, we demonstrate that the high variance of the commonly used Hutchinson trace estimator makes it unsuitable for importance sampling, whereas Verlet flows perform comparably to full autograd trace computations while being significantly faster.
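The following is a minimal illustrative sketch (not code from the paper) of the variance issue the abstract refers to: CNF log-likelihoods require the trace of the vector field's Jacobian, and the single-probe Hutchinson estimator of that trace can fluctuate widely around the exact value. All names and dimensions here are hypothetical.

```python
# Hypothetical comparison of the Hutchinson trace estimator vs. an exact trace,
# the quantity that enters CNF log-likelihood computations.
import numpy as np

rng = np.random.default_rng(0)
d = 10
J = rng.normal(size=(d, d))  # stand-in for a CNF vector-field Jacobian


def hutchinson_trace(J, n_probes, rng):
    # tr(J) ~= (1/n) * sum_i eps_i^T J eps_i with Rademacher probes eps_i
    eps = rng.choice([-1.0, 1.0], size=(n_probes, J.shape[0]))
    return np.mean(np.einsum("ni,ij,nj->n", eps, J, eps))


exact_trace = np.trace(J)
estimates = [hutchinson_trace(J, n_probes=1, rng=rng) for _ in range(1000)]

print("exact trace:", exact_trace)
print("single-probe Hutchinson mean / std:", np.mean(estimates), np.std(estimates))
```

The estimator is unbiased, but its spread (the printed standard deviation) is what the abstract argues is problematic for importance sampling weights, which depend exponentially on log-likelihood errors.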
Submission Number: 78