Abstract: Markov chain Monte Carlo (MCMC) requires only the ability to evaluate the likelihood, making it a common technique for inference in complex models. However, it can have a slow mixing rate, requiring many samples to obtain good estimates and incurring a high overall computational cost. Shrek MCMC is a multi-fidelity layered MCMC method that exploits lower-fidelity approximations of the true likelihood calculation to improve mixing, leading to faster overall performance. Such lower-fidelity likelihoods are commonly available in scientific and engineering applications where the model involves a simulation whose resolution or accuracy can be tuned. Our technique uses recursive, layered chains with simple layer tuning; it does not require the likelihood to take any particular form or to have any particular internal mathematical structure. We demonstrate experimentally that Shrek MCMC achieves larger effective sample sizes for the same computational time across different scientific domains, including hydrology and cosmology.
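To illustrate the general multi-fidelity idea the abstract describes (using a cheap approximate likelihood to screen proposals before paying for the expensive one), here is a minimal sketch of a two-level delayed-acceptance Metropolis-Hastings step. This is a generic technique, not the paper's actual Shrek MCMC algorithm; the two likelihood functions are hypothetical stand-ins for an expensive simulation and its tunable low-resolution approximation.

```python
import math
import random

random.seed(0)

def log_like_hi(x):
    # "High-fidelity" log-likelihood: a standard normal, standing in
    # for an expensive simulation-based likelihood (hypothetical).
    return -0.5 * x * x

def log_like_lo(x):
    # "Low-fidelity" approximation: a slightly mis-scaled normal
    # (hypothetical stand-in for a coarse-resolution simulation).
    return -0.5 * (x / 1.1) ** 2

def da_mh_step(x, step=1.0):
    """One delayed-acceptance MH step: screen the proposal with the cheap
    likelihood first; evaluate the expensive likelihood only if the first
    stage accepts. The second stage corrects for the approximation, so the
    chain still targets the exact high-fidelity posterior."""
    y = x + random.gauss(0.0, step)
    # Stage 1: cheap screen with the low-fidelity likelihood.
    a1 = min(1.0, math.exp(log_like_lo(y) - log_like_lo(x)))
    if random.random() >= a1:
        return x  # rejected cheaply; expensive model never evaluated
    # Stage 2: correction ratio using the expensive likelihood.
    a2 = min(1.0, math.exp(
        (log_like_hi(y) - log_like_hi(x)) - (log_like_lo(y) - log_like_lo(x))
    ))
    return y if random.random() < a2 else x

# Run a short chain and check it roughly recovers the target's moments.
x = 0.0
samples = []
for _ in range(5000):
    x = da_mh_step(x)
    samples.append(x)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

Many low-fidelity rejections here cost only the cheap evaluation, which is the source of the speedup; a layered method such as the one described above generalizes this two-level structure to a recursive hierarchy of fidelities.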
Submission Length: Long submission (more than 12 pages of main content)
Previous TMLR Submission Url: https://openreview.net/forum?id=JfLLmVqWfL
Changes Since Last Submission: The previous submission accidentally used an 11-point font. We changed it to the 10-point type specified in the style file.
Assigned Action Editor: ~Trevor_Campbell1
Submission Number: 4345