Projected Low-Rank Gradient in Diffusion-based Models for Inverse Problems

Published: 30 Sept 2024 · Last Modified: 30 Oct 2024 · D3S3 2024 Poster · License: CC BY 4.0
Keywords: Diffusion models, Inverse problems, Robustness
TL;DR: Improving the performance and robustness of diffusion-based inverse problems using a projected low-rank gradient method
Abstract: Recent advancements in diffusion models have demonstrated their potential as powerful learned data priors for solving inverse problems. A popular Bayesian approach leverages diffusion sampling steps to induce a data prior, generating images from noise while incorporating measurement gradient updates at each step to impose data consistency. However, diffusion models are highly sensitive to the measurement gradient step size and struggle to keep the sampling trajectory on the data manifold, leading to performance degradation and artifacts in the sampled posterior. We propose a Projected Low-Rank Gradient (PLoRG) method that approximates the data manifold structure to enhance the performance and robustness of diffusion models in solving inverse problems. Our approach leverages singular value decomposition to approximate the measurement gradient in a lower-rank subspace defined by the current state, effectively preserving the manifold structure and filtering out artifact-inducing components. In addition to superior robustness, we show that PLoRG improves the performance of diffusion models on a range of linear and nonlinear inverse problems, especially inherently challenging ones such as phase retrieval.
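The core idea described in the abstract, projecting the measurement gradient onto a low-rank subspace derived from the SVD of the current state, can be sketched as follows. This is a minimal illustrative sketch based only on the abstract: the function name, the fixed-rank truncation, and the update step are assumptions, not the authors' actual implementation (which may select the rank adaptively or operate on intermediate representations).

```python
import numpy as np

def project_low_rank(x_t, grad, rank):
    """Hypothetical sketch: project a measurement gradient onto the
    low-rank subspace defined by the current diffusion state.

    x_t  : (H, W) array, current state of the sampling process
    grad : (H, W) array, measurement gradient at this step
    rank : number of singular directions to keep (assumed fixed here)
    """
    # SVD of the current state defines the approximate manifold subspace.
    U, _, Vt = np.linalg.svd(x_t, full_matrices=False)
    Ur = U[:, :rank]   # top-`rank` left singular vectors
    Vr = Vt[:rank, :]  # top-`rank` right singular vectors
    # Project the gradient onto the row/column subspaces of x_t,
    # filtering out components that would push the state off the manifold.
    return Ur @ (Ur.T @ grad @ Vr.T) @ Vr

# Illustrative gradient-update step (step size `eta` is assumed):
# x_next = x_t - eta * project_low_rank(x_t, grad, rank=20)
```

Because the operation is a composition of two orthogonal projectors, the result has rank at most `rank` and re-projecting it is a no-op, which is what "filtering out artifact-inducing components" amounts to in this sketch.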
Submission Number: 30