Multimodal base distributions for continuous-time normalising flows

Published: 31 Oct 2023, Last Modified: 02 Nov 2023
Venue: DLDE III Poster
Keywords: Continuous normalising flow, neural ordinary differential equation, out-of-distribution likelihoods
TL;DR: We present evidence for the utility of a multimodal base distribution (specifically a GMM) over the standard Gaussian in continuous-time normalising flows.
Abstract: We investigate the utility of a multimodal base distribution in continuous-time normalising flows. Multimodality is incorporated through a Gaussian mixture model (GMM) centred at the empirical means of the target distribution's modes. In- and out-of-distribution likelihoods are reported for flows trained with unimodal and multimodal base distributions. Our results show that the GMM base distribution achieves in-distribution likelihoods comparable to those of a standard (unimodal) Gaussian base, while additionally providing the ability to sample from a specific mode of the target distribution, yielding generated samples of improved quality, and giving more reliable out-of-distribution likelihoods for low-dimensional input spaces. We conclude that a GMM base distribution is an attractive alternative to the standard base: it incurs little to no cost, and its parameterisation may assist with more reliable out-of-distribution likelihoods.
Submission Number: 6
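
To make the construction concrete, here is a minimal PyTorch sketch (not the authors' code) of a GMM base distribution whose components are centred at the empirical means of a target's modes, as the abstract describes. The uniform mixture weights, unit component scale, and the two example mode locations are illustrative assumptions; in a CNF, this distribution would simply replace the standard normal in the base log-density term of the change-of-variables formula.

```python
import torch
from torch.distributions import Categorical, Independent, MixtureSameFamily, Normal


def make_gmm_base(mode_means: torch.Tensor, scale: float = 1.0) -> MixtureSameFamily:
    """GMM base distribution with one Gaussian component per target mode.

    mode_means: (K, D) tensor of empirical mode means (one row per mode).
    scale: shared isotropic standard deviation (an assumed hyperparameter).
    """
    K, D = mode_means.shape
    mixture = Categorical(logits=torch.zeros(K))  # uniform weight on each mode
    components = Independent(Normal(mode_means, scale * torch.ones(K, D)), 1)
    return MixtureSameFamily(mixture, components)


# Hypothetical 2-D target with modes near (-4, 0) and (4, 0).
means = torch.tensor([[-4.0, 0.0], [4.0, 0.0]])
base = make_gmm_base(means)

z = base.sample((5,))      # latents a trained flow would map to data space;
                           # sampling near one row of `means` targets that mode
log_pz = base.log_prob(z)  # base log-density term in the CNF log-likelihood
```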