Score-based generative models learn manifold-like structures with constrained mixing

Published: 29 Nov 2022, Last Modified: 06 Oct 2023, SBM 2022 Poster
Keywords: manifolds, generative models, analysis, score matching
Abstract: How do score-based generative models (SBMs) learn a data distribution supported on a lower-dimensional manifold? We investigate the score model of a trained SBM through its linear approximations and the subspaces spanned by local feature vectors. As the noise decreases during diffusion, the local dimensionality increases and becomes more varied across different sample sequences. Importantly, we find that the learned vector field mixes images via a non-conservative field within the manifold, while in off-manifold directions it denoises along normal projections, as if governed by a potential function. At each noise level, the subspace spanned by the local features overlaps with an effective density function. These observations suggest that SBMs can flexibly mix samples with the learned score field while carefully maintaining a manifold-like structure of the data distribution.
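The abstract's two diagnostics, local dimensionality from the score model's linear approximation and conservativity of the learned vector field, can be illustrated with a small numerical sketch. This is not the paper's actual pipeline; the toy score fields, the finite-difference Jacobian, and the tolerance are illustrative assumptions. A field is conservative (the gradient of a potential) exactly when its Jacobian is symmetric, so the antisymmetric part measures the mixing component; the number of significant singular values of the Jacobian gives a local dimensionality.

```python
import numpy as np

def numerical_jacobian(score_fn, x, eps=1e-5):
    """Central-difference Jacobian of a score field at point x."""
    d = x.shape[0]
    J = np.zeros((d, d))
    for i in range(d):
        e = np.zeros(d)
        e[i] = eps
        J[:, i] = (score_fn(x + e) - score_fn(x - e)) / (2 * eps)
    return J

def local_dimensionality(J, tol=1e-3):
    """Count singular values above tol * largest: the dimension of the
    subspace spanned by the local feature vectors."""
    s = np.linalg.svd(J, compute_uv=False)
    return int(np.sum(s > tol * s[0]))

def asymmetry(J):
    """Relative norm of the antisymmetric part of the Jacobian.
    ~0 for a conservative field (symmetric Jacobian); larger values
    indicate a rotational, non-conservative component."""
    A = 0.5 * (J - J.T)
    return np.linalg.norm(A) / np.linalg.norm(J)

# Toy score fields (hypothetical, not from the paper):
# conservative: score of a standard Gaussian, s(x) = -x (Jacobian -I)
conservative = lambda x: -x
# non-conservative: same attractor plus an in-plane rotation
rotational = lambda x: -x + np.array([-x[1], x[0]])

x0 = np.array([1.0, 0.5])
J_c = numerical_jacobian(conservative, x0)
J_r = numerical_jacobian(rotational, x0)
```

Here `asymmetry(J_c)` is near zero while `asymmetry(J_r)` is not, separating the potential-like behavior from the mixing behavior at a single point.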
Student Paper: No