Hierarchical Neural Simulation-Based Inference Over Event Ensembles

Published: 11 Feb 2024, Last Modified: 15 Mar 2024. Accepted by TMLR.
Abstract: When analyzing real-world data, it is common to work with event ensembles, which comprise sets of observations that collectively constrain the parameters of an underlying model of interest. Such models often have a hierarchical structure, where "local" parameters impact individual events and "global" parameters influence the entire dataset. We introduce practical approaches for frequentist and Bayesian dataset-wide probabilistic inference in cases where the likelihood is intractable, but simulations can be realized via a hierarchical forward model. We construct neural estimators for the likelihood(-ratio) or posterior and show that explicitly accounting for the model's hierarchical structure can lead to significantly tighter parameter constraints. We ground our discussion using case studies from the physical sciences, focusing on examples from particle physics and cosmology.
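To illustrate the hierarchical setup the abstract describes, the following is a minimal toy sketch (not drawn from the paper; the Gaussian forward model, parameter names, and event count are all illustrative assumptions). A global parameter governs local per-event parameters, which in turn generate the observed events; when the local parameters can be marginalized, the dataset-wide log-likelihood factorizes into a sum of per-event terms, which is what makes ensemble-level inference over the global parameter tractable:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy hierarchical forward model (illustrative only):
# global theta -> local z_i ~ N(theta, 1) -> observed event x_i ~ N(z_i, 1).
theta_true = 1.5
n_events = 200
z = rng.normal(theta_true, 1.0, size=n_events)  # local per-event parameters
x = rng.normal(z, 1.0)                          # observed event ensemble

def dataset_loglik(theta, x):
    # Marginalizing each local z_i analytically gives p(x_i | theta) = N(x_i; theta, 2),
    # so the dataset-wide log-likelihood is a sum over per-event terms.
    return np.sum(-0.5 * (x - theta) ** 2 / 2.0 - 0.5 * np.log(2 * np.pi * 2.0))

# Grid-based maximum-likelihood estimate of the global parameter.
thetas = np.linspace(0.0, 3.0, 301)
logliks = np.array([dataset_loglik(t, x) for t in thetas])
theta_hat = thetas[np.argmax(logliks)]
print(f"MLE of global parameter: {theta_hat:.2f}")
```

In the paper's setting the per-event marginal likelihood is intractable, so a neural estimator trained on simulations from the hierarchical forward model stands in for the analytic `dataset_loglik` used here.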
Submission Length: Regular submission (no more than 12 pages of main content)
Supplementary Material: zip
Changes Since Last Submission: We thank all reviewers and the action editor for engaging thoughtfully with the paper. As suggested, we have removed the term "optimal", including at the level of the abstract, and replaced it with the term "hierarchy-aware". We have also moved the qualification of the term from a footnote on the first page into the main text, recognizing that this is a critical aspect of the paper. Finally, we have added example images drawn from the astrophysics forward model (Fig. 5) in order to clarify this example a bit more.
Video: https://indico.in2p3.fr/event/30589/timetable/#14-optimal-dataset-wide-infere
Code: https://github.com/smsharma/hierarchical-inference
Assigned Action Editor: ~Alp_Kucukelbir1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 1712