Improving Constrained Language Generation via Self-Distilled Twisted Sequential Monte Carlo

NeurIPS 2025 Workshop FPI Submission 24 Authors

Published: 23 Sept 2025, Last Modified: 25 Nov 2025, FPI-NEURIPS2025 Poster, CC BY 4.0
Track: Main Track
Keywords: Twisted Sequential Monte Carlo, Constrained Language Generation, Self-Distillation
Abstract: Recent work has framed constrained text generation with autoregressive language models as a probabilistic inference problem. In this line of work, Zhao et al. (2024) introduced a promising approach based on twisted Sequential Monte Carlo (SMC), which uses learned twist functions and twist-induced proposals to guide generation toward the target distribution. However, in constrained generation settings where the target distribution concentrates on outputs that are unlikely under the base model, learning the twists is difficult because the reward signal is sparse and uninformative. We show that iteratively refining the base model through self-distillation alleviates this issue: the model becomes progressively better aligned with the target, yielding substantial gains in generation quality.
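To make the abstract's description concrete, below is a minimal, self-contained sketch of one twisted SMC generation loop with a twist-induced proposal. The base model, the twist function, the toy constraint (preferring even tokens), and all names here are illustrative assumptions for exposition, not the authors' implementation; in the paper, the base model is an autoregressive language model and the twists are learned.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB, T, K = 8, 6, 16  # toy vocabulary size, sequence length, number of particles

def base_logits(prefix):
    """Stand-in for the base model's next-token logits p_0(x_t | x_{<t})."""
    h = (sum(prefix) + len(prefix)) % VOCAB
    return -0.3 * np.abs(np.arange(VOCAB) - h)

def twist_value(prefix):
    """Stand-in for a learned twist psi_t(x_{1:t}): an estimate of how likely the
    constraint is to be satisfied given this prefix. Toy constraint: even tokens."""
    return 1.0 if (len(prefix) == 0 or prefix[-1] % 2 == 0) else 0.1

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def twisted_smc(num_particles, seq_len):
    particles = [[] for _ in range(num_particles)]
    log_w = np.zeros(num_particles)
    for _ in range(seq_len):
        for k in range(num_particles):
            logits = base_logits(particles[k])
            # Twist-induced proposal: base logits reweighted by per-token twist values.
            twists = np.array([twist_value(particles[k] + [v]) for v in range(VOCAB)])
            q = softmax(logits + np.log(twists))
            v = rng.choice(VOCAB, p=q)
            # Incremental importance weight: p_0(v) * psi_t / (q(v) * psi_{t-1}).
            p0 = softmax(logits)
            prev = twist_value(particles[k])
            log_w[k] += np.log(p0[v]) + np.log(twists[v]) - np.log(q[v]) - np.log(prev)
            particles[k].append(v)
        # Multinomial resampling when the effective sample size degenerates.
        w = softmax(log_w)
        if 1.0 / np.sum(w ** 2) < num_particles / 2:
            idx = rng.choice(num_particles, size=num_particles, p=w)
            particles = [list(particles[i]) for i in idx]
            log_w[:] = 0.0
    return particles, log_w

samples, log_w = twisted_smc(K, T)
print(samples[int(np.argmax(log_w))])
```

Under the self-distillation scheme described in the abstract, sequences produced by such a sampler would be used to fine-tune the base model, and the sample-then-refine loop would then be repeated so that later rounds start from a model already closer to the constrained target.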
Submission Number: 24