Abstract: Denoising diffusion models have driven significant progress in Bayesian inverse problems. Recent approaches use pre-trained diffusion models as priors to solve a wide range of such problems, relying only on inference-time compute and thereby eliminating the need to retrain task-specific models on the same dataset. To approximate the posterior of a Bayesian inverse problem, a diffusion model samples from a sequence of intermediate posterior distributions, each with an intractable likelihood function. This work proposes a novel mixture approximation of these intermediate distributions. Since direct gradient-based sampling of these mixtures is infeasible due to intractable terms, we propose a practical method based on Gibbs sampling. We validate our approach through extensive experiments on image inverse problems, using both pixel- and latent-space diffusion priors, as well as on source separation with an audio diffusion model. The code is available at \url{https://www.github.com/badr-moufad/mgdm}.
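For readers unfamiliar with the setup the abstract refers to, the following is a minimal sketch in standard diffusion-posterior-sampling notation; it is generic background, not necessarily the paper's exact formulation or its mixture approximation.

```latex
% Generic setup: observation y, clean signal x_0, diffusion marginal p_t at noise level t.
\[
  p(x_0 \mid y) \propto p(y \mid x_0)\, p(x_0),
  \qquad
  p_t(x_t \mid y) \propto p_t(x_t)\, p_t(y \mid x_t),
\]
\[
  p_t(y \mid x_t) = \int p(y \mid x_0)\, p(x_0 \mid x_t)\, \mathrm{d}x_0 .
\]
% The denoising posterior p(x_0 | x_t) is only implicitly defined by the pre-trained
% diffusion model, so each intermediate likelihood p_t(y | x_t) is intractable; the
% intermediate posteriors p_t(x_t | y) are the distributions the abstract's mixture
% approximation targets.
```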
Lay Summary: This work introduces a new method for solving inverse problems, which are tasks where one tries to recover a hidden cause from observed data—like reconstructing a clear image from a blurry one, or separating individual sounds from a mixed audio recording.
Traditionally, solving these problems in a Bayesian way (which means estimating not just one answer, but a distribution of likely answers) has been very computationally expensive. Recent advances use diffusion models—a powerful class of generative AI models—to do this more efficiently by treating them as flexible prior assumptions about what realistic signals (like images or audio) look like.
However, diffusion models don’t directly provide the solution: they sample from a sequence of gradually refined guesses. This process relies on complex intermediate distributions that are hard to work with mathematically. To address this, the authors propose a new way of approximating these distributions using mixtures (combinations of simpler components), and introduce a practical technique—based on Gibbs sampling—to generate samples from them.
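To make the Gibbs idea concrete, here is a toy sketch of Gibbs sampling for a simple Gaussian mixture. All parameters below are made up for illustration only; the paper's intermediate posteriors are far more involved, since they combine a diffusion prior with an intractable likelihood, so this shows only the generic alternate-conditional-sampling pattern.

```python
# Toy Gibbs sampler for a 1D Gaussian mixture p(x) = sum_k w_k N(x; mu_k, sigma_k^2).
# Illustration of the Gibbs-for-mixtures pattern only, not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mixture parameters.
weights = np.array([0.3, 0.7])
means = np.array([-2.0, 3.0])
stds = np.array([1.0, 0.5])

def sample_component_given_x(x):
    # z | x: posterior responsibility of each mixture component at the current x.
    log_probs = np.log(weights) - 0.5 * ((x - means) / stds) ** 2 - np.log(stds)
    probs = np.exp(log_probs - log_probs.max())
    probs /= probs.sum()
    return rng.choice(len(weights), p=probs)

def sample_x_given_component(z):
    # x | z: draw from the selected Gaussian component.
    return rng.normal(means[z], stds[z])

# Gibbs sweep: alternate the two conditional draws.
x, samples = 0.0, []
for _ in range(5000):
    z = sample_component_given_x(x)
    x = sample_x_given_component(z)
    samples.append(x)

print(np.mean(samples), np.std(samples))  # should roughly match the mixture's moments
```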
Link To Code: https://www.github.com/badr-moufad/mgdm
Primary Area: Probabilistic Methods
Keywords: Diffusion Models, Guidance, Inverse Problems, Monte Carlo methods
Submission Number: 3964