Generative Modeling with Bayesian Sample Inference

03 May 2026 (modified: 05 May 2026) · Under review for TMLR · CC BY 4.0
Abstract: We derive a novel generative model from iterative Gaussian posterior inference. By treating the generated sample as an unknown variable, we can formulate the sampling process in the language of Bayesian probability. Our model uses a sequence of prediction and posterior update steps to iteratively narrow down the unknown sample starting from a broad initial belief. In addition to a rigorous theoretical analysis, we establish a connection between our model and diffusion models and show that it includes Bayesian Flow Networks (BFNs) as a special case. In our experiments, we demonstrate that our model improves sample quality on ImageNet32 over both BFNs and the closely related Variational Diffusion Models, while achieving equivalent log-likelihoods on ImageNet32 and ImageNet64.
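The abstract's core loop — start from a broad Gaussian belief, then repeatedly predict the sample and apply a conjugate posterior update — can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual algorithm: the factorized known-variance Gaussian model, the fixed observation precision `alpha`, and the placeholder `predict` function are all assumptions made for illustration.

```python
import numpy as np

def bayesian_sample_inference(predict, dim, n_steps=200, alpha=1.0, seed=0):
    """Iteratively narrow a Gaussian belief over an unknown sample.

    Toy setup (assumed, not from the paper): per-dimension belief
    x ~ N(mu, 1/lam) with known observation precision alpha.
    Each step: (1) predict a sample estimate from the current belief,
    (2) draw a noisy observation of that estimate, (3) perform the
    conjugate Gaussian posterior update of (mu, lam).
    """
    rng = np.random.default_rng(seed)
    mu = np.zeros(dim)          # broad initial belief: zero mean
    lam = 1e-2 * np.ones(dim)   # low precision = high uncertainty
    for _ in range(n_steps):
        x_hat = predict(mu, lam)                        # prediction step
        y = x_hat + rng.normal(0.0, alpha ** -0.5, dim) # noisy observation
        mu = (lam * mu + alpha * y) / (lam + alpha)     # posterior mean
        lam = lam + alpha                               # posterior precision
    return mu, lam
```

With a predictor that always returns the same target, the belief mean converges toward that target while the precision grows linearly in the number of steps, mirroring the "narrowing down" described in the abstract.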
Submission Type: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Andreas_Lehrmann1
Submission Number: 8736