Keywords: differential privacy, diffusion-based sampling, Gaussian differential privacy, EDM
TL;DR: We provide a systematic privacy analysis of diffusion sampling by modeling each denoising step as a Gaussian mechanism and analyzing the total privacy loss under Gaussian DP composition.
Abstract: Diffusion models have emerged as the foundation of modern generative systems, yet their high memorization capacity raises privacy concerns. While differentially private (DP) training provides formal guarantees, it remains impractical for large-scale diffusion models. In this work, we take a different route by analyzing privacy leakage during the sampling process. We introduce an empirical denoiser that enables tractable computation of per-step sensitivities, allowing each denoising step to be interpreted as a Gaussian mechanism. Building on this perspective, we apply Gaussian Differential Privacy (GDP) to derive tight privacy bounds. Furthermore, we identify critical windows in the denoising trajectory—time steps where salient semantic features emerge—and quantify how privacy loss depends on where sampling stops relative to these windows. Our study provides the first systematic characterization of privacy guarantees in diffusion sampling, offering a principled foundation for designing privacy-preserving generative pipelines beyond DP training.
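The accounting described in the abstract—treating each denoising step as a Gaussian mechanism and composing the per-step guarantees under GDP—can be sketched as follows. This is a minimal illustration of standard GDP composition, not the paper's implementation; the step sensitivities and noise scales used in the example are hypothetical placeholders.

```python
import math

def normal_cdf(x: float) -> float:
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def step_mu(sensitivity: float, sigma: float) -> float:
    """A Gaussian mechanism with L2 sensitivity Delta and noise std sigma
    satisfies (Delta / sigma)-GDP."""
    return sensitivity / sigma

def compose_mu(mus) -> float:
    """GDP composition: running mu_i-GDP mechanisms in sequence yields
    sqrt(sum_i mu_i^2)-GDP overall."""
    return math.sqrt(sum(m * m for m in mus))

def gdp_to_delta(mu: float, eps: float) -> float:
    """Convert mu-GDP to an (eps, delta) guarantee:
    delta = Phi(-eps/mu + mu/2) - exp(eps) * Phi(-eps/mu - mu/2)."""
    return (normal_cdf(-eps / mu + mu / 2)
            - math.exp(eps) * normal_cdf(-eps / mu - mu / 2))

# Example: 50 denoising steps, each modeled as a Gaussian mechanism
# with placeholder per-step sensitivity 0.1 and noise std 1.0.
mus = [step_mu(sensitivity=0.1, sigma=1.0) for _ in range(50)]
mu_total = compose_mu(mus)          # sqrt(50) * 0.1
delta = gdp_to_delta(mu_total, eps=1.0)
print(mu_total, delta)
```

Stopping sampling earlier (fewer steps contributing to the sum) directly shrinks the composed `mu_total`, which is the quantitative handle behind the critical-window analysis described above.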
Primary Area: alignment, fairness, safety, privacy, and societal considerations
Submission Number: 4582