Keywords: Diffusion Models, Inverse Problems, Constrained Generation
TL;DR: We present a method to embed differentiable constraints as inductive biases in a score matching diffusion model
Abstract: Diffusion models struggle to produce samples that respect constraints, a common requirement in scientific applications. Existing approaches introduce regularization terms in the loss or guidance methods during sampling to enforce such constraints, but they bias the generative model away from the true data distribution. This is especially problematic when the constraint is misspecified, a common issue when formulating constraints on scientific data. In this paper, instead of changing the loss or the sampling loop, we integrate a guidance-inspired adjustment into the denoiser itself, giving it a soft inductive bias towards constraint-compliant samples. Through experiments, we show that these softly constrained denoisers exploit the constraint knowledge to produce compliant samples, while maintaining enough flexibility to deviate from the constraint when it is misspecified with respect to the observed data.
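The core idea of a guidance-inspired adjustment inside the denoiser can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: `base_denoiser`, the sum-to-one constraint, and the `strength` parameter are all hypothetical stand-ins chosen for clarity.

```python
import numpy as np

def base_denoiser(x_noisy, sigma):
    # Stand-in for a trained score-matching denoiser (hypothetical):
    # here, a toy Gaussian-prior posterior mean that shrinks toward zero.
    return x_noisy / (1.0 + sigma**2)

def constraint_residual(x):
    # Example differentiable constraint: components should sum to 1.
    return np.sum(x) - 1.0

def constraint_grad(x):
    # Gradient of the penalty 0.5 * residual^2 with respect to x;
    # for the sum constraint this is residual * ones.
    return constraint_residual(x) * np.ones_like(x)

def soft_constrained_denoiser(x_noisy, sigma, strength=0.5):
    # Guidance-inspired adjustment folded into the denoiser itself:
    # nudge the denoised estimate down the constraint-penalty gradient.
    # A small `strength` keeps the bias soft, so the denoiser can still
    # deviate from a misspecified constraint.
    x_hat = base_denoiser(x_noisy, sigma)
    return x_hat - strength * constraint_grad(x_hat)

x_noisy = np.array([0.8, 1.0, 0.6])
before = abs(constraint_residual(base_denoiser(x_noisy, 1.0)))
after = abs(constraint_residual(soft_constrained_denoiser(x_noisy, 1.0)))
print(before, after)  # the adjustment shrinks the constraint violation
```

Because the adjustment lives in the denoiser rather than in the loss or the sampling loop, the same trained network can be used with any standard diffusion sampler.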
Supplementary Material: zip
Primary Area: generative models
Submission Number: 7071