Keywords: diffusion models, generative models, sampling, stochastic differential equations, total variation, Wasserstein distance, denoising score matching, kernel estimation
Abstract: We propose a two-stage pipeline for high-dimensional time series generation: (i) nonparametric kernel estimation of the conditional first and second moments of the underlying data increments to recover residuals, and (ii) a score-based diffusion model trained on these residuals. We derive finite-time convergence estimates for reverse-time sampling in both total variation (TV) and Wasserstein-2 ($W_2$) distance, with explicit dependence on the variance-preserving noise schedule. Experiments on synthetic multivariate processes validate that (a) the empirical TV and $W_2$ errors track the theoretical upper bounds, and (b) Monte Carlo estimates of test functionals achieve the predicted standard errors.
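A minimal sketch of stage (i), assuming a Nadaraya-Watson estimator with a Gaussian kernel and a hand-picked bandwidth `h` (not the authors' code): estimate the conditional mean and variance of the increments and form standardized residuals, which would then serve as training data for the diffusion model in stage (ii).

```python
# Sketch of stage (i): kernel estimation of conditional increment moments
# and standardized residuals. Kernel choice and bandwidth are assumptions.
import numpy as np

def gaussian_kernel(u):
    """Standard Gaussian kernel, applied elementwise."""
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def kernel_moments(x, dx, grid, h):
    """Nadaraya-Watson estimates of E[dX | X=x] and Var[dX | X=x] on `grid`."""
    # w[i, j] = K((grid[i] - x[j]) / h): weight of sample j at evaluation point i
    w = gaussian_kernel((grid[:, None] - x[None, :]) / h)
    w_sum = w.sum(axis=1, keepdims=True)
    mean = (w * dx[None, :]).sum(axis=1, keepdims=True) / w_sum
    var = (w * (dx[None, :] - mean) ** 2).sum(axis=1, keepdims=True) / w_sum
    return mean.ravel(), var.ravel()

def residuals(x, dx, h):
    """Standardized residuals (dX - mu_hat(X)) / sigma_hat(X) at the sample points."""
    mu, var = kernel_moments(x, dx, x, h)
    return (dx - mu) / np.sqrt(np.maximum(var, 1e-12))

if __name__ == "__main__":
    # Toy 1D example: Ornstein-Uhlenbeck-like increments
    rng = np.random.default_rng(0)
    n, dt = 5000, 0.01
    x = np.zeros(n)
    for t in range(n - 1):
        x[t + 1] = x[t] - 0.5 * x[t] * dt + np.sqrt(dt) * rng.standard_normal()
    dx = np.diff(x)
    eps = residuals(x[:-1], dx, h=0.3)
    print("residual mean/std:", eps.mean(), eps.std())  # roughly 0 and 1
```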
Primary Area: generative models
Submission Number: 21211