Reducing Memorisation in Generative Models via Riemannian Bayesian Inference

Published: 31 Oct 2025, Last Modified: 28 Nov 2025 · EurIPS 2025 Workshop PriGM · CC BY 4.0
Keywords: Diffusion Models, Flow Matching, Generative models, Memorization, Generalization, Riemannian Geometry, Bayesian Inference
Abstract: Balancing memorisation and generalisation in generative models remains an open problem. To investigate it, we employ Bayesian methods, which have recently been proposed for predicting the uncertainty of generated samples. Specifically, we use the Riemannian Laplace approximation, from which we can sample generative models that resemble the trained one. Our geometry-aware approach yields improved results compared to its Euclidean counterpart.
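To make the idea concrete, here is a minimal sketch of the (Euclidean) Laplace approximation the abstract builds on, for a toy one-parameter regression model: we find the MAP weight, approximate the posterior with a Gaussian whose precision is the Hessian of the negative log posterior at the MAP, and then draw weight samples, each of which defines a model resembling the trained one. The data, noise variance, and prior variance below are illustrative assumptions, not from the paper; the paper's contribution replaces this Euclidean Gaussian with a Riemannian (metric-aware) version.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2x + noise (assumed for illustration)
x = rng.normal(size=50)
y = 2.0 * x + 0.1 * rng.normal(size=50)

sigma2, alpha2 = 0.1**2, 10.0**2  # assumed noise and prior variances

# Hessian of the negative log posterior for the linear model y ≈ w * x
h = (x @ x) / sigma2 + 1.0 / alpha2

# Closed-form MAP estimate of the single weight w
w_map = (x @ y) / sigma2 / h

# Laplace approximation: posterior over w is N(w_map, 1/h)
w_samples = w_map + rng.normal(size=5) / np.sqrt(h)

# Each sampled weight defines a model close to the trained (MAP) one
for w in w_samples:
    print(f"sampled model: y = {w:.4f} * x")
```

Sampling many such weight vectors and generating from each yields an ensemble of models whose spread reflects posterior uncertainty; the Riemannian variant shapes that spread according to the curvature of the loss landscape rather than a flat Euclidean metric.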
Submission Number: 10