FAST‑DIPS: Adjoint‑Free Analytic Steps and Hard‑Constrained Likelihood Correction for Diffusion‑Prior Inverse Problems
Keywords: inverse problem, image reconstruction, diffusion models
Abstract: $\textbf{FAST-DIPS}$ is a training-free solver for diffusion-prior inverse problems, including those with nonlinear forward operators. At each noise level, a pretrained denoiser provides an anchor $\mathbf{x}_{0|t}$; we then perform a hard-constrained proximal correction in measurement space (under AWGN) by solving
$\min_\mathbf{x} \tfrac{1}{2\gamma_t}\|\mathbf{x}-\mathbf{x}_{0|t}\|^2 \ \text{s.t.}\ \|\mathcal{A}(\mathbf{x})-\mathbf{y}\|\le\varepsilon$.
The correction is implemented via an adjoint-free ADMM with a closed-form projection onto the Euclidean ball and a few steepest-descent updates whose step size is analytic and computable from one VJP and one JVP (or a forward-difference surrogate), followed by decoupled re-annealing. We show that this step minimizes a local quadratic model (with descent guaranteed by backtracking), that any ADMM fixed point satisfies the KKT conditions of the hard-constrained problem, and that mode substitution yields a bounded time-marginal error. We also derive a latent variant $\mathcal{A}\mapsto\mathcal{A}\circ\mathcal{D}$ and a one-parameter pixel$\rightarrow$latent hybrid schedule. Across eight linear and nonlinear tasks, FAST-DIPS matches or surpasses training-free baselines while reducing wall-clock time by $5\times$–$25\times$, requiring only autodiff access to $\mathcal{A}$ and no hand-coded adjoints or inner MCMC.
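The two ingredients of the correction described in the abstract can be sketched in JAX. This is a minimal illustration, not the authors' implementation: the helper names (`project_ball`, `x_update_step`) and the parameters `gamma` and `rho` are hypothetical, and the analytic step size is taken here to be an exact line search under a local Gauss-Newton quadratic model, which requires exactly one VJP (for the gradient) and one JVP (for the curvature term), as the abstract states.

```python
import jax
import jax.numpy as jnp


def project_ball(z, y, eps):
    """Closed-form Euclidean projection of z onto the ball {v : ||v - y|| <= eps}."""
    r = z - y
    n = jnp.linalg.norm(r)
    # Scale the residual back to the ball surface if it lies outside; leave it if inside.
    return y + r * jnp.minimum(1.0, eps / jnp.maximum(n, 1e-12))


def x_update_step(A, x, x0, z, u, gamma, rho):
    """One steepest-descent update on the (illustrative) ADMM x-subproblem
        f(x) = 1/(2*gamma) ||x - x0||^2 + rho/2 ||A(x) - z + u||^2,
    with an analytic step size from a local Gauss-Newton quadratic model."""
    # One VJP: gradient of the data-fidelity term without a hand-coded adjoint.
    Ax, vjp = jax.vjp(A, x)
    r = Ax - z + u
    g = (x - x0) / gamma + rho * vjp(r)[0]
    # One JVP: curvature of the quadratic model along the descent direction g.
    _, Jg = jax.jvp(A, (x,), (g,))
    denom = jnp.vdot(g, g) / gamma + rho * jnp.vdot(Jg, Jg)
    alpha = jnp.vdot(g, g) / jnp.maximum(denom, 1e-12)
    return x - alpha * g
```

For a linear operator `A` the Gauss-Newton model is exact, so a single step with this `alpha` is an exact line search along the negative gradient; for nonlinear `A` it is only a local model, which is where the backtracking safeguard mentioned in the abstract would come in.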
Primary Area: generative models
Submission Number: 24185