Interleaved Gibbs Diffusion for Constrained Generation

Published: 06 Mar 2025, Last Modified: 15 Apr 2025 · ICLR 2025 DeLTa Workshop Poster · CC BY 4.0
Track: long paper (up to 8 pages)
Keywords: Discrete Diffusion Models, Planning, Constrained Generation, Generative AI
TL;DR: A mixed-mode diffusion model that denoises coordinate by coordinate and does not assume factorizability of the denoising posterior, enabling very strong performance on planning and constrained generation tasks.
Abstract: We introduce Interleaved Gibbs Diffusion (IGD), a novel generative modeling framework for mixed continuous-discrete data, focusing on constrained generation problems. Prior works on discrete and continuous-discrete diffusion models assume a factorized denoising distribution for fast generation, which can hinder the modeling of strong dependencies between random variables encountered in constrained generation. IGD moves beyond this by interleaving continuous and discrete denoising algorithms via a discrete-time, Gibbs-sampling-type Markov chain. IGD provides flexibility in the choice of denoisers, allows conditional generation via state-space doubling, and supports inference-time scaling via the ReDeNoise method. Empirical evaluations on three challenging tasks, namely solving 3-SAT, generating molecule structures, and generating layouts, demonstrate state-of-the-art performance. Notably, IGD achieves a 7% improvement on 3-SAT out of the box and achieves state-of-the-art results in molecule generation without relying on equivariant diffusion or domain-specific architectures. We explore a wide range of modeling and interleaving strategies, along with hyperparameters, for each of these problems.
Submission Number: 81
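
The interleaving idea in the abstract can be read as a Gibbs-style sweep over coordinates: each continuous or discrete coordinate is re-denoised conditioned on the current values of all the others, so no factorization of the denoising posterior is required. The Python sketch below is only an illustration of that reading; the function names, the sweep schedule, and the partial re-noising used to mimic ReDeNoise-style inference-time scaling are assumptions for exposition, not the paper's actual algorithm or API.

import numpy as np

def interleaved_gibbs_denoise(x_cont, x_disc, denoise_cont, denoise_disc,
                              num_sweeps=10, redenoise_rounds=0):
    # x_cont: numpy array of continuous coordinates, initialized from the noise prior.
    # x_disc: list of discrete coordinates, initialized uniformly over their vocabularies.
    # denoise_cont(i, x_cont, x_disc) -> new value for continuous coordinate i,
    #     conditioned on all other coordinates (no factorization assumption).
    # denoise_disc(j, x_cont, x_disc) -> new value for discrete coordinate j.
    for _ in range(num_sweeps):
        # Gibbs-style interleaving: update one coordinate at a time, always
        # conditioning on the current values of every other coordinate.
        for i in range(len(x_cont)):
            x_cont[i] = denoise_cont(i, x_cont, x_disc)
        for j in range(len(x_disc)):
            x_disc[j] = denoise_disc(j, x_cont, x_disc)

    # Illustrative ReDeNoise-style inference-time scaling (assumed form):
    # re-noise a coordinate slightly, then denoise it again to spend extra
    # compute where constraints are hard to satisfy.
    for _ in range(redenoise_rounds):
        i = np.random.randint(len(x_cont))
        x_cont[i] += np.random.normal(scale=0.1)
        x_cont[i] = denoise_cont(i, x_cont, x_disc)

    return x_cont, x_disc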