Noise is All You Need: Solving Linear Inverse Problems by Noise Combination Sampling with Diffusion Models

20 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Diffusion Model, Inverse Problem, Generative Model, Optimization
TL;DR: We address the stability issue in solving general linear inverse problems with diffusion models by approximating the noise term in DDPM with the projection of the measurement score onto a noise subspace.
Abstract: Pretrained diffusion models have demonstrated strong capabilities in zero-shot inverse problem solving by incorporating observation information into the generative process. However, this presents a dilemma: excessive integration disrupts the generative process, while insufficient integration fails to enforce the constraints imposed by the inverse problem. To address this, we propose $\textit{Noise Combination Sampling}$, a novel method that synthesizes an optimal noise vector from a noise subspace to approximate the measurement score function, replacing the noise term in the standard Denoising Diffusion Probabilistic Models (DDPM) sampling process. This embeds conditional information naturally into the generation process without step-wise hyperparameter tuning. Our method applies to a wide range of inverse problem solvers, including image compression, and, in most scenarios, especially when the number of generation steps $T$ is small, achieves superior performance with negligible computational overhead, significantly improving robustness and stability.
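The core operation described in the abstract, synthesizing a noise vector from a noise subspace to approximate a measurement-score direction, can be sketched as a least-squares projection. The following is a minimal illustrative sketch, not the paper's actual algorithm: the subspace construction (i.i.d. Gaussian columns), the number of basis noises, and the rescaling step are all assumptions made for illustration.

```python
import numpy as np

def noise_combination(measurement_score, num_noises=8, seed=0):
    """Illustrative sketch: project a measurement-score vector onto the
    span of `num_noises` i.i.d. Gaussian noise vectors (a hypothetical
    "noise subspace"), then rescale to the magnitude a DDPM noise term
    of the same dimension would typically have."""
    rng = np.random.default_rng(seed)
    d = measurement_score.shape[0]
    # Columns of N span the assumed noise subspace.
    N = rng.standard_normal((d, num_noises))
    # Least-squares coefficients of the score in that subspace.
    w, *_ = np.linalg.lstsq(N, measurement_score, rcond=None)
    combined = N @ w
    # Rescale so ||combined|| = sqrt(d), matching E||eps||
    # for eps ~ N(0, I_d) (an assumption of this sketch).
    combined *= np.sqrt(d) / (np.linalg.norm(combined) + 1e-12)
    return combined
```

In a DDPM-style sampler, this vector would replace the random noise added at each reverse step, so the measurement information enters through the noise term itself rather than through an explicitly weighted guidance term.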
Primary Area: generative models
Submission Number: 24376