Keywords: Protein Generation, RNA Generation, diffusion language models, discrete diffusion models
TL;DR: We introduce Path Planning, a masked diffusion model sampling method that improves decoding order in biological sequence generation.
Abstract: In this paper, we investigate how the order in which tokens are unmasked during masked diffusion model (MDM) inference affects generative quality. We derive an expanded evidence lower bound (ELBO) that introduces a planner responsible for selecting which tokens to unmask at each step. Our analysis suggests that alternative unmasking strategies can improve generative performance. Based on these insights, we propose Path Planning (P2), a training-free inference framework that leverages a pre-trained BERT or the denoiser itself to guide unmasking decisions. P2 generalizes all known MDM sampling strategies and enables significant improvements across diverse domains, including language generation (in-context learning, code generation, story infilling, mathematical reasoning, reversal curse correction) and biological sequence generation (protein and RNA sequences).
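A minimal sketch of the planner-guided unmasking loop described above, assuming a denoiser that returns per-position logits and a planner that scores which masked positions to reveal next. The function names (`p2_style_sampler`, `confidence_planner`), the per-step unmasking budget, and the confidence-based planner are illustrative assumptions, not the paper's actual implementation.

```python
import torch

def p2_style_sampler(denoiser, planner_score, seq_len, mask_id,
                     num_steps=16, device="cpu"):
    """Iteratively unmask a fully masked sequence (hypothetical sketch).

    denoiser(x)             -> logits of shape (seq_len, vocab_size)
    planner_score(logits, x)-> per-position scores; higher = unmask sooner
    """
    x = torch.full((seq_len,), mask_id, dtype=torch.long, device=device)
    per_step = max(1, seq_len // num_steps)
    for _ in range(num_steps):
        masked = (x == mask_id)
        if not masked.any():
            break
        logits = denoiser(x)                              # (seq_len, vocab_size)
        scores = planner_score(logits, x)                 # (seq_len,)
        scores = scores.masked_fill(~masked, float("-inf"))  # only score masked slots
        k = min(per_step, int(masked.sum()))
        positions = scores.topk(k).indices                # positions to reveal now
        probs = torch.softmax(logits[positions], dim=-1)
        x[positions] = torch.multinomial(probs, 1).squeeze(-1)
    return x

def confidence_planner(logits, x):
    # One simple planner choice: reveal positions where the denoiser is most confident.
    return torch.softmax(logits, dim=-1).max(dim=-1).values

# Toy usage with a random stand-in "denoiser" (illustrative only):
vocab, L, MASK = 32, 20, 31
fake_denoiser = lambda x: torch.randn(x.shape[0], vocab)
sample = p2_style_sampler(fake_denoiser, confidence_planner, L, MASK)
```

Swapping `confidence_planner` for scores produced by an external model (e.g., a pre-trained BERT) would correspond to the externally guided variant mentioned in the abstract; that choice of guidance model is an assumption here.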
Submission Number: 27