Keywords: Amortized Bayesian Inference, Hierarchical Models, Compositional Modeling, Score Matching
TL;DR: We extend amortized Bayesian inference to hierarchical models using compositional score matching with adaptive solvers and a novel error-damping estimator.
Abstract: Amortized Bayesian inference (ABI) with neural networks has emerged as a powerful simulation-based approach for estimating complex mechanistic models.
However, extending ABI to hierarchical models, a cornerstone of modern Bayesian analysis, has remained a major challenge due to the need to simulate and process massive datasets.
Our study tackles this challenge by extending compositional score matching (CSM), a divide-and-conquer strategy for Bayesian updating with diffusion models (sketched below the abstract).
We develop a new error-damping estimator that addresses the numerical instabilities previous CSM formulations exhibit when aggregating large numbers of data points.
We first verify numerical stability on a controlled benchmark with up to 100,000 data points.
We then evaluate our method on a hierarchical autoregressive (AR) model, achieving performance competitive with direct ABI baselines on smaller problems while requiring less than the equivalent of a single full-model simulation on larger ones.
Finally, we address a large-scale inverse problem in advanced microscopy with over 750,000 parameters, demonstrating the method's relevance to real scientific applications.
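For context, below is a minimal sketch of the standard compositional-posterior identity that CSM-style methods build on. This is background, not the paper's error-damping estimator, and the notation (\theta for shared parameters, x_1, ..., x_N for conditionally i.i.d. observations) is illustrative:

```latex
% Compositional posterior for conditionally i.i.d. observations x_1..x_N:
% multiplying single-observation posteriors overcounts the prior N times,
% so (N - 1) prior factors are divided out.
p(\theta \mid x_{1:N})
  \;\propto\; p(\theta) \prod_{n=1}^{N} p(x_n \mid \theta)
  \;\propto\; p(\theta)^{\,1-N} \prod_{n=1}^{N} p(\theta \mid x_n).
% Taking log-gradients yields the score decomposition that CSM aggregates:
\nabla_\theta \log p(\theta \mid x_{1:N})
  \;=\; (1-N)\,\nabla_\theta \log p(\theta)
  \;+\; \sum_{n=1}^{N} \nabla_\theta \log p(\theta \mid x_n).
```

In diffusion-based CSM this decomposition holds exactly only at noise level zero and is applied approximately at other noise levels, so per-observation score errors can accumulate as N grows; this is consistent with the large-N instabilities that the error-damping estimator described above is designed to mitigate.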
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 17728