Latent-IMH: Efficient Bayesian Inference for Inverse Problems with Approximate Operators
Abstract: We study sampling from posterior distributions in Bayesian linear inverse problems where $\mathbf{A}$, the parameter-to-observable operator, is computationally expensive. In many applications $\mathbf{A}$ can be factored in a manner that facilitates the construction of a cost-effective approximation $\widetilde{\mathbf{A}}$. In this framework, we introduce Latent-IMH, a sampling method based on the independent Metropolis-Hastings (IMH) sampler. Latent-IMH first generates intermediate latent variables using the approximate $\widetilde{\mathbf{A}}$, and then refines them using the exact $\mathbf{A}$. Its primary benefit is that it shifts most of the computational cost to an offline phase. We theoretically analyze the performance of Latent-IMH using KL divergence and mixing time bounds. Through numerical experiments on several model problems, we show that, under reasonable assumptions, it outperforms state-of-the-art methods such as the No-U-Turn Sampler (NUTS) in computational efficiency. In some cases Latent-IMH can be orders of magnitude faster than existing schemes.
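To make the surrogate-then-correct idea concrete, below is a minimal, hypothetical Python sketch for a linear-Gaussian model: the IMH proposal is the posterior induced by the cheap surrogate $\widetilde{\mathbf{A}}$ (built offline), and the exact $\mathbf{A}$ enters only the accept/reject step. The function name `latent_imh_sketch`, the Gaussian prior/noise setup, and all parameter names are our assumptions; this illustrates the underlying IMH mechanism, not the paper's full Latent-IMH construction with intermediate latent variables.

```python
import numpy as np

def latent_imh_sketch(y, A, A_tilde, sigma2, tau2, n_samples, seed=None):
    """Hypothetical sketch: independent Metropolis-Hastings where the
    proposal is the Gaussian posterior induced by the cheap surrogate
    A_tilde; the exact (expensive) A appears only in accept/reject."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]

    # Offline phase: surrogate posterior N(m, C) from the cheap operator.
    H = A_tilde.T @ A_tilde / sigma2 + np.eye(n) / tau2  # surrogate precision
    C = np.linalg.inv(H)                                 # surrogate covariance
    m = C @ (A_tilde.T @ y) / sigma2                     # surrogate mean
    L = np.linalg.cholesky(C)                            # C = L @ L.T

    def log_target(x):
        # Exact unnormalized log-posterior (uses the expensive A).
        return (-np.sum((y - A @ x) ** 2) / (2 * sigma2)
                - np.sum(x ** 2) / (2 * tau2))

    def log_proposal(x):
        # Surrogate Gaussian log-density, up to an additive constant.
        r = np.linalg.solve(L, x - m)
        return -0.5 * np.sum(r ** 2)

    x = m.copy()
    samples = []
    for _ in range(n_samples):
        x_new = m + L @ rng.standard_normal(n)  # independent proposal
        log_alpha = (log_target(x_new) - log_target(x)
                     + log_proposal(x) - log_proposal(x_new))
        if np.log(rng.uniform()) < log_alpha:   # IMH accept/reject
            x = x_new
        samples.append(x.copy())
    return np.array(samples)
```

The design point the abstract emphasizes shows up here directly: the matrix factorizations involving $\widetilde{\mathbf{A}}$ happen once, offline, while each sampling step touches the exact $\mathbf{A}$ only through a single forward application in the acceptance ratio.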
Submission Number: 1965