Guided Star-Shaped Masked Diffusion

ICLR 2026 Conference Submission 25294 Authors

20 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Discrete Diffusion, Text Diffusion Models, Masked Diffusion, Guided Sampling
TL;DR: We developed a new sampling algorithm that, with minimal fine-tuning, enables pre-trained diffusion models to self-correct, significantly boosting quality in few-step generation.
Abstract: The performance of pre-trained masked diffusion models is often constrained by their sampling procedure, which makes decisions irreversible and struggles in low-step generation regimes. We introduce a novel sampling algorithm that works with pre-trained models and, after lightweight fine-tuning of a single layer, significantly improves sample quality and efficiency. Our method reformulates the generation process using a star-shaped paradigm, which inherently allows for error correction. To make this process effective, we augment it with a learnable re-masking scheduler that intelligently identifies and revises likely errors. This approach yields a substantial quality boost, particularly when using a small number of sampling steps. We extensively ablate key components of our approach and demonstrate its applicability in different scenarios. In comprehensive experiments on text and code generation, our sampling algorithm outperforms or matches existing methods.
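To make the abstract's idea concrete, here is a minimal sketch of few-step masked-diffusion decoding with confidence-based re-masking, which lets earlier commitments be revised rather than made irreversibly. This is an illustration under stated assumptions, not the authors' algorithm: the `predict` interface, the confidence-based scheduler, and the `remask_frac` parameter are hypothetical stand-ins (the paper's scheduler is learned, not a fixed heuristic).

```python
import numpy as np

MASK = -1  # hypothetical sentinel id for a masked position


def sample_with_remasking(predict, seq_len, steps=4, remask_frac=0.25):
    """Few-step decoding sketch with re-masking for error correction.

    `predict(tokens, is_masked)` is an assumed model interface returning
    (token_ids, confidences) for every position. After each denoising step
    (except the last), the lowest-confidence fraction of positions is
    re-masked, so previously committed tokens can still be revised --
    unlike standard masked-diffusion sampling, where unmasking is final.
    """
    tokens = np.full(seq_len, MASK)
    for step in range(steps):
        ids, conf = predict(tokens, tokens == MASK)
        tokens = ids.copy()  # commit the model's current predictions
        if step < steps - 1:
            # re-mask the least confident positions for another revision pass
            k = max(1, int(remask_frac * seq_len))
            worst = np.argsort(conf)[:k]
            tokens[worst] = MASK
    return tokens
```

A learned scheduler would replace the fixed `np.argsort(conf)[:k]` heuristic with a trained layer that predicts which positions are likely errors.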
Supplementary Material: zip
Primary Area: generative models
Submission Number: 25294