Discrete Feynman-Kac Correctors

Published: 09 Jul 2025, Last Modified: 16 Jul 2025 · AI4Math@ICML25 Poster · CC BY-NC-SA 4.0
Keywords: discrete diffusion models, inference time control, amortized machine learning, product of experts, annealing
TL;DR: An inference-time method that allows sampling from product or annealed distributions of discrete diffusion models, applied to improving amortized regression with diffusion language models
Abstract: The performance of Large Language Models (LLMs) depends directly on the size of the context the model was trained on. Despite significant progress in increasing the context size of current models, some applications remain bottlenecked by the number of tokens that can be processed at inference time. One mathematical problem LLMs can be applied to is inferring the parameters of a statistical model, given data points as input. Here we demonstrate that discrete diffusion models offer a promising avenue for scaling such parameter-prediction tasks, by combining the outputs of the same model evaluated on different parts of the training data. We propose Discrete Feynman-Kac Correctors --- a framework for controlling the distribution generated by discrete masked diffusion models at inference time. We derive Sequential Monte Carlo (SMC) algorithms that, given a trained discrete diffusion model, sample from its annealed distribution or from the product of its distributions under different conditions. Notably, our framework requires no training, no fine-tuning, and no external reward functions. Finally, we apply our framework to amortized linear regression using LLaDA and demonstrate that it drastically outperforms the standard inference procedure in terms of both accuracy and adherence to the prompt format.
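To make the product-of-experts mechanism concrete, below is a minimal, illustrative SMC sketch in Python. It is not the authors' implementation: the model interface `denoise_probs` is a self-contained stand-in for a trained masked diffusion network (such as LLaDA), and the condition strings, particle count, and the specific scheme of proposing from one expert while reweighting by the others are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB, LENGTH, MASK = 8, 6, -1  # toy sizes; MASK marks not-yet-decoded positions

def denoise_probs(seq, condition, pos):
    """Stand-in for a trained masked discrete diffusion model: returns a
    distribution over the token at `pos` given the partially unmasked `seq`
    and a conditioning string. Real code would call the network; here we
    fabricate a deterministic distribution so the sketch runs on its own."""
    seed = hash((tuple(int(t) for t in seq), condition, int(pos))) % (2**32)
    logits = np.random.default_rng(seed).normal(size=VOCAB)
    p = np.exp(logits - logits.max())
    return p / p.sum()

def smc_product_sample(conditions, n_particles=16):
    """Approximately sample from the product of the model's conditional
    distributions: propose each unmasking step under the first condition,
    weight by the remaining conditions (the Feynman-Kac potential), and
    resample particles in proportion to their weights."""
    particles = [np.full(LENGTH, MASK) for _ in range(n_particles)]
    for pos in rng.permutation(LENGTH):  # fixed unmasking schedule
        logw = np.zeros(n_particles)
        for k, seq in enumerate(particles):
            q = denoise_probs(seq, conditions[0], pos)  # proposal expert
            others = [denoise_probs(seq, c, pos) for c in conditions[1:]]
            tok = rng.choice(VOCAB, p=q)
            seq[pos] = tok
            for p_c in others:  # upweight tokens all experts agree on
                logw[k] += np.log(p_c[tok] + 1e-12)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = rng.choice(n_particles, size=n_particles, p=w)  # multinomial resampling
        particles = [particles[i].copy() for i in idx]
    return particles

# Usage: combine the model's predictions conditioned on two data shards,
# e.g. two halves of a regression dataset that do not fit in one context.
samples = smc_product_sample(conditions=("data shard A", "data shard B"))
print(samples[0])
```

The annealed variant described in the abstract would, under the same assumptions, replace the cross-expert potential with a power of the proposal's own probability (e.g. weighting each step by `q[tok] ** (gamma - 1)` for an inverse temperature `gamma`), leaving the propose-weight-resample structure unchanged.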
Submission Number: 166