Likelihood-Free Inference with Deep Gaussian Processes

Published: 06 Jul 2022, Last Modified: 22 Oct 2023
Venue: NeurIPS 2020 Deep Inverse Workshop Poster
Keywords: Likelihood-Free Inference, Bayesian Deep Learning, Bayesian Optimization, Gaussian Processes
TL;DR: Deep GPs as surrogates in Bayesian Optimization solve likelihood-free inference tasks with irregular target distributions in hundreds of simulator calls.
Abstract: In recent years, surrogate models have been successfully used in likelihood-free inference to reduce the number of simulator evaluations. The most data-efficient solution for this task has been achieved by Bayesian Optimization with Gaussian Processes (GPs). While this combination works well for unimodal target distributions, it appears restrictive in more irregular cases. Neural network approaches, on the other hand, are extremely adaptable given sufficient data, which are rarely available when working with computationally expensive simulators. In this extended abstract, we address the trade-off between data-efficiency and flexibility by proposing a Deep Gaussian Process (DGP) surrogate model that can handle irregularly behaved target distributions with few simulator evaluations. Our experiments show that DGPs can outperform GPs on objective functions with multimodal distributions while maintaining comparable performance in unimodal cases. At the same time, DGPs generally require far less data to match the performance of Mixture Density Networks and Masked Autoregressive Flows. This confirms that DGPs as surrogate models for Bayesian Optimization provide a good trade-off between data-efficiency and flexibility for likelihood-free inference with computationally intensive simulators.
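The abstract's central idea, Bayesian Optimization over the discrepancy between simulator output and observed data, can be sketched in a few lines. The snippet below is a minimal illustration in the spirit of BOLFI-style inference, using scikit-learn's standard GP as a stand-in for the paper's Deep GP surrogate. The toy simulator, the mean-based discrepancy, and the lower-confidence-bound acquisition are illustrative assumptions, not the authors' exact setup.

```python
# Minimal sketch of likelihood-free inference via Bayesian Optimization.
# A standard GP surrogate stands in for the paper's Deep GP; everything
# here (simulator, discrepancy, acquisition) is a hypothetical toy example.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def simulator(theta):
    # Hypothetical expensive simulator: noisy observations around theta.
    return theta + rng.normal(0.0, 0.5, size=20)

observed = simulator(2.0)  # pretend this is the real data (true theta = 2)

def discrepancy(theta):
    # Distance between simulated and observed summary statistics (the mean).
    return abs(simulator(theta).mean() - observed.mean())

bounds = (-5.0, 5.0)                    # prior support for theta
thetas = list(rng.uniform(*bounds, 5))  # initial design points
ds = [discrepancy(t) for t in thetas]

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)

for _ in range(30):  # budget of a few dozen simulator calls
    gp.fit(np.array(thetas)[:, None], np.array(ds))
    cand = np.linspace(*bounds, 500)[:, None]
    mu, sigma = gp.predict(cand, return_std=True)
    # Lower confidence bound: favor small predicted discrepancy
    # while still exploring uncertain regions.
    nxt = float(cand[np.argmin(mu - 2.0 * sigma), 0])
    thetas.append(nxt)
    ds.append(discrepancy(nxt))

print("Estimated theta:", thetas[int(np.argmin(ds))])
```

Replacing the GP with a DGP (e.g., a doubly stochastic variational DGP) is what lets the surrogate capture the multimodal, irregularly behaved discrepancy surfaces the abstract describes, while keeping the simulator budget in the hundreds of calls.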
Community Implementations (CatalyzeX): [1 code implementation](https://www.catalyzex.com/paper/arxiv:2006.10571/code)