Abstract: A promising class of generative models maps points from a simple distribution to a complex distribution through an invertible neural network. Likelihood-based training of these models requires restricting their architectures to allow cheap computation of Jacobian determinants. Alternatively, the Jacobian trace can be used if the transformation is specified by an ordinary differential equation. In this paper, we use Hutchinson’s trace estimator to give a scalable unbiased estimate of the log-density. The result is a continuous-time invertible generative model with unbiased density estimation and one-pass sampling, while allowing unrestricted neural network architectures. We demonstrate our approach on high-dimensional density estimation, image generation, and variational inference, achieving the state-of-the-art among exact likelihood methods with efficient sampling.
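To make the estimator in the abstract concrete, here is a minimal PyTorch sketch of Hutchinson's trace estimator, tr(A) = E[ε᷀Aε] for noise ε with zero mean and identity covariance, applied to the Jacobian ∂f/∂z. It only needs vector-Jacobian products from autograd, never the full Jacobian, which is why the dynamics network can be unrestricted. The function name `hutchinson_divergence` and the sanity check are our illustration, not the paper's code.

```python
import torch

def hutchinson_divergence(f, z, n_samples=1):
    """Unbiased estimate of tr(df/dz) per batch element via Hutchinson's
    estimator: tr(A) = E[eps^T A eps]. Uses only vector-Jacobian products."""
    z = z.detach().requires_grad_(True)
    fz = f(z)
    est = torch.zeros(z.shape[0])
    for _ in range(n_samples):
        eps = torch.randn_like(z)  # Gaussian noise; Rademacher also works
        # One vjp gives eps^T (df/dz) without materializing the Jacobian.
        (eps_J,) = torch.autograd.grad(fz, z, grad_outputs=eps, retain_graph=True)
        est += (eps_J * eps).sum(dim=1)
    return est / n_samples

# Sanity check on a linear map f(z) = z @ W.T, whose Jacobian is W,
# so the estimate should concentrate around tr(W).
W = torch.randn(3, 3)
f = lambda z: z @ W.T
z = torch.randn(5, 3)
approx = hutchinson_divergence(f, z, n_samples=1000)
print(approx.mean().item(), torch.trace(W).item())  # should be close
```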
Keywords: generative models, density estimation, approximate inference, ordinary differential equations
TL;DR: We use continuous time dynamics to define a generative model with exact likelihoods and efficient sampling that is parameterized by unrestricted neural networks.
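A rough sketch of the continuous-time change of variables the TL;DR refers to: the sample and its log-density are integrated jointly, with the divergence term estimated as above. The toy network `f`, the fixed-step Euler loop, and the step count are our illustrative choices; the paper's setting uses adaptive black-box ODE solvers.

```python
import torch
import torch.nn as nn

# Hypothetical unrestricted dynamics network (time dependence omitted for brevity).
f = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 2))

def dynamics_and_divergence(z, eps):
    """Returns dz/dt = f(z) and a Hutchinson estimate of tr(df/dz).
    The instantaneous change of variables gives d(log p)/dt = -tr(df/dz)."""
    with torch.enable_grad():
        z = z.detach().requires_grad_(True)
        dz = f(z)
        # For training, pass create_graph=True so gradients can flow through.
        (eps_J,) = torch.autograd.grad(dz, z, grad_outputs=eps)
    return dz.detach(), (eps_J * eps).sum(dim=1)

# Fixed-step Euler integration of x -> z(1), illustration only. Then
#   log p(x) = log p_base(z(1)) + \int_0^1 tr(df/dz) dt.
x = torch.randn(128, 2)        # points whose density we evaluate
z, int_trace = x, torch.zeros(128)
eps = torch.randn_like(x)      # one noise draw, reused along the trajectory
n_steps = 100
for _ in range(n_steps):
    dz, div = dynamics_and_divergence(z, eps)
    z = z + dz / n_steps
    int_trace = int_trace + div / n_steps

base = torch.distributions.Normal(0.0, 1.0)
log_px = base.log_prob(z).sum(dim=1) + int_trace
```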
Code: [7 community implementations](https://paperswithcode.com/paper/?openreview=rJxgknCcK7)
Data: [BSD](https://paperswithcode.com/dataset/bsd), [CIFAR-10](https://paperswithcode.com/dataset/cifar-10), [Caltech-101](https://paperswithcode.com/dataset/caltech-101), [MNIST](https://paperswithcode.com/dataset/mnist), [Omniglot](https://paperswithcode.com/dataset/omniglot-1), [UCI Machine Learning Repository](https://paperswithcode.com/dataset/uci-machine-learning-repository)
Community Implementations: [5 code implementations](https://www.catalyzex.com/paper/ffjord-free-form-continuous-dynamics-for/code)