Keywords: Diffusion Models, Flow Matching, Generative models, Memorization, Generalization, Riemannian Geometry, Bayesian Inference
Abstract: How to balance memorization and generalization in generative models remains an open problem. To investigate it, we adopt Bayesian methods, which have recently been proposed for quantifying the uncertainty of generated samples. Specifically, we employ the Riemannian Laplace approximation, from which we can draw generative models that resemble the trained one. Our geometry-aware approach yields improved results compared to its Euclidean counterpart.
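The abstract contrasts the Riemannian Laplace approximation with its Euclidean counterpart. As context for that baseline, the sketch below shows how the standard (Euclidean) Laplace approximation samples model weights from a Gaussian centered at the trained solution; the names (`theta_map`, the surrogate Hessian `H`) and the toy dimensions are illustrative assumptions, not the paper's actual setup, and the Riemannian variant would additionally account for a non-Euclidean metric on the weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: theta_map stands in for the trained (MAP) weights and H
# for an approximation of the loss Hessian at theta_map; both are toy values.
dim = 5
theta_map = rng.normal(size=dim)
A = rng.normal(size=(dim, dim))
H = A @ A.T + dim * np.eye(dim)  # symmetric positive-definite surrogate Hessian

# Euclidean Laplace approximation: the weight posterior is approximated by
# N(theta_map, H^{-1}); each draw defines one perturbed generative model.
cov = np.linalg.inv(H)
L = np.linalg.cholesky(cov)
samples = theta_map + rng.normal(size=(10, dim)) @ L.T  # 10 sampled models

print(samples.shape)
```

Each row of `samples` would parameterize one generative model "resembling the trained one", whose outputs can then be compared to assess uncertainty.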
Submission Number: 10