An Unconditional Representation of the Conditional Score in Infinite Dimensional Linear Inverse Problems
Abstract: Score-based diffusion models (SDMs) have emerged as a powerful tool for sampling from the posterior distribution in Bayesian inverse problems. However, existing methods often require multiple evaluations of the forward mapping to generate a single sample, resulting in significant computational costs for large-scale inverse problems. To address this, we propose an unconditional representation of the conditional score function (UCoS) tailored to linear inverse problems, which avoids forward model evaluations during sampling by shifting computational effort to an offline training phase. In this phase, a task-dependent score function is learned based on the linear forward operator. Crucially, we show that the conditional score can be derived exactly from a trained (unconditional) score using affine transformations, eliminating the need for conditional score approximations. Our approach is formulated in infinite-dimensional function spaces, making it inherently discretization-invariant. We support this formulation with a rigorous convergence analysis that justifies UCoS beyond any specific discretization. Finally, we validate UCoS through high-dimensional computed tomography (CT) and image deblurring experiments, demonstrating both scalability and accuracy.
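As a point of orientation for the abstract's claim that the conditional (posterior) score in a linear inverse problem can be obtained from unconditional quantities, the sketch below checks the standard Bayes decomposition in a finite-dimensional linear-Gaussian toy problem, where both sides are available in closed form. This is an illustrative consistency check, not the paper's UCoS construction; all names (`A`, `sigma_prior`, `sigma_noise`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 5, 3                      # latent and observation dimensions
A = rng.standard_normal((m, d))  # hypothetical linear forward operator
sigma_prior, sigma_noise = 1.0, 0.1

# Linear-Gaussian model: x ~ N(0, sigma_prior^2 I), y = A x + N(0, sigma_noise^2 I)
x_true = sigma_prior * rng.standard_normal(d)
y = A @ x_true + sigma_noise * rng.standard_normal(m)

x = rng.standard_normal(d)       # evaluation point for the scores

# Conditional score via Bayes' rule:
# grad_x log p(x|y) = grad_x log p(x) + grad_x log p(y|x)
score_prior = -x / sigma_prior**2
score_likelihood = A.T @ (y - A @ x) / sigma_noise**2
score_bayes = score_prior + score_likelihood

# Conditional score from the closed-form Gaussian posterior N(mu, Sigma):
# Sigma^{-1} = I/sigma_prior^2 + A^T A / sigma_noise^2,  mu = Sigma A^T y / sigma_noise^2
prec_post = np.eye(d) / sigma_prior**2 + A.T @ A / sigma_noise**2
mu_post = np.linalg.solve(prec_post, A.T @ y / sigma_noise**2)
score_posterior = -prec_post @ (x - mu_post)

assert np.allclose(score_bayes, score_posterior)
```

In this toy case the likelihood term is an affine function of `x` built from the forward operator alone, which is the structural fact the paper exploits; UCoS goes further by absorbing this structure into a task-dependent score learned offline, so that no forward-operator evaluations are needed at sampling time.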
Submission Type: Long submission (more than 12 pages of main content)
Changes Since Last Submission: We have addressed the reviewers’ comments, and the revision includes the following **major** changes:
- We now provide a visualization of the per-coordinate noise levels of $\Sigma_t$ for both a masking operator and the CT problem.
- We added an inpainting problem with a Gaussian prior, in which we explicitly solve the score functions to demonstrate the fidelity of both UCoS and the Conditional method.
- We improved the presentation by introducing the main contribution in a finite-dimensional setting before moving to the more technical infinite-dimensional case. The exposition has been clarified through a more explicit separation of high-level ideas from detailed derivations.
- We expanded the discussion of sample complexity for the score functions in both the Conditional approach and UCoS, and clarified the associated runtime comparison.
- Several figures previously placed in the appendix have been moved into the main text to better illustrate the advantages of UCoS.
- We moderated claims related to the numerical experiments by clarifying that no computational acceleration is achieved over the Conditional approach when comparable FNO architectures are used, and that differences in sample quality across methods remain relatively inconclusive.
- We corrected various minor typographical issues.
Code: https://github.com/FabianSBD/SBD-task-dependent/tree/UCoS
Assigned Action Editor: ~Alain_Durmus1
Submission Number: 5213