Abstract: Diffusion generative models unlock new possibilities for inverse problems, as they allow strong empirical priors to be incorporated into the process of scientific inference. Recently, diffusion models have been repurposed for solving inverse problems using Gaussian approximations to the conditional densities of the reverse process, with Tweedie’s formula parameterising the mean, complemented by various heuristics. To address the challenges arising from these approximations, we leverage higher-order information via Tweedie’s formula and obtain a statistically principled approximation. We further provide a theoretical guarantee specifically for posterior sampling, which can lead to a better theoretical understanding of diffusion-based conditional sampling. Finally, we illustrate the empirical effectiveness of our approach on general linear inverse problems, both on toy synthetic examples and on image restoration. We show that our method (i) removes the time-dependent step-size hyperparameters required by earlier methods, (ii) brings stability and better sample quality across multiple noise levels, and (iii) is the only method that remains stable with variance exploding (VE) forward processes, in contrast to earlier works.
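For context on the abstract's use of Tweedie's formula, the following is a minimal sketch of its standard first- and second-moment statements, assuming a Gaussian forward kernel $x_t \mid x_0 \sim \mathcal{N}(s_t x_0, \sigma_t^2 I)$; the scale $s_t$ and noise level $\sigma_t$ are illustrative notation, not taken from the paper itself.

```latex
% Tweedie's formula (standard statement), assuming the forward kernel
% x_t | x_0 ~ N(s_t x_0, sigma_t^2 I), with marginal density p_t(x_t).
% First moment: the posterior mean used to parameterise Gaussian approximations.
% Second moment: the posterior covariance, i.e. the "higher-order information".
\begin{align}
  \mathbb{E}[x_0 \mid x_t]
    &= \frac{1}{s_t}\left(x_t + \sigma_t^2 \,\nabla_{x_t} \log p_t(x_t)\right), \\
  \operatorname{Cov}[x_0 \mid x_t]
    &= \frac{\sigma_t^2}{s_t^2}\left(I + \sigma_t^2 \,\nabla^2_{x_t} \log p_t(x_t)\right).
\end{align}
```

In practice the score $\nabla_{x_t} \log p_t(x_t)$ is replaced by a learned network, so the second identity gives a covariance estimate without extra step-size hyperparameters.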
Certifications: Featured Certification
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: Added link to code to run all experiments, including instructions on how to run them. Removed blue font of changes since last submission, added acknowledgements, deanonymised.
Code: https://github.com/bb515/tmpdtorch
Assigned Action Editor: ~Jasper_Snoek1
Submission Number: 2971