DP-LFlow: Differentially Private Latent Flow for Scalable Sensitive Image Generation

Published: 23 Jun 2023, Last Modified: 05 Jul 2023
Venue: DeployableGenerativeAI
Keywords: differential privacy, generative model, normalizing flow
TL;DR: An efficient and effective approach to train a differentially private flow generative model
Abstract: Differentially private generative models (DPGMs) are designed to generate data that are distributionally similar to the original sensitive data while providing differential privacy (DP) guarantees. While GANs attract the most attention, existing DPGMs based on flow generative models are limited and have only been developed on low-dimensional tabular datasets. The capability of *exact* density estimation makes flow models exceptional, especially when density estimation is of interest. In this work, we first show that it is challenging (or even infeasible) to train a DP flow with acceptable utility on high-dimensional image datasets via DP-SGD, i.e., the workhorse algorithm for private deep learning, and then we give an effective solution by moving the generation from the pixel space to a lower-dimensional latent space. We show the effectiveness and scalability of the proposed method via extensive experiments. Notably, our method scales to high-resolution image datasets, which is rarely studied in related works.
Submission Number: 17
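To make the idea in the abstract concrete, here is a minimal sketch (not the authors' implementation) of training a small flow model over latent codes with DP-SGD via the Opacus library. The `TinyLatentFlow` and `AffineCoupling` classes, the latent dimensionality, and all hyperparameters are illustrative assumptions; the latents are stand-ins for codes that would come from a separate encoder, whose own privacy treatment is outside this sketch.

```python
# Sketch: DP-SGD training of a toy RealNVP-style flow on low-dimensional latents.
# Everything here (architecture, dimensions, hyperparameters) is illustrative,
# not the settings from the paper.
import math
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine


class AffineCoupling(nn.Module):
    """One affine coupling layer over a latent vector."""
    def __init__(self, dim):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half, 64), nn.ReLU(),
            nn.Linear(64, 2 * (dim - self.half)),
        )

    def forward(self, z):
        z1, z2 = z[:, :self.half], z[:, self.half:]
        s, t = self.net(z1).chunk(2, dim=1)
        s = torch.tanh(s)                          # keep scales bounded
        y2 = z2 * torch.exp(s) + t
        return torch.cat([z1, y2], dim=1), s.sum(dim=1)


class TinyLatentFlow(nn.Module):
    """Stack of couplings; forward() returns the exact log-likelihood."""
    def __init__(self, dim, n_layers=4):
        super().__init__()
        self.layers = nn.ModuleList([AffineCoupling(dim) for _ in range(n_layers)])
        self.dim = dim

    def forward(self, z):
        log_det = torch.zeros(z.size(0), device=z.device)
        for layer in self.layers:
            z, ld = layer(z)
            z = z.flip(dims=[1])                   # mix halves between couplings
            log_det = log_det + ld
        base = -0.5 * (z ** 2).sum(dim=1) - 0.5 * self.dim * math.log(2 * math.pi)
        return base + log_det


# Placeholder latent codes standing in for encoder outputs of sensitive images.
latents = torch.randn(10_000, 32)
loader = DataLoader(TensorDataset(latents), batch_size=256, shuffle=True)

flow = TinyLatentFlow(dim=32)
optimizer = torch.optim.Adam(flow.parameters(), lr=1e-3)

# DP-SGD: per-sample gradient clipping plus Gaussian noise, handled by Opacus.
privacy_engine = PrivacyEngine()
flow, optimizer, loader = privacy_engine.make_private(
    module=flow,
    optimizer=optimizer,
    data_loader=loader,
    noise_multiplier=1.0,   # illustrative value, not the paper's setting
    max_grad_norm=1.0,      # illustrative clipping bound
)

for epoch in range(5):
    for (z,) in loader:
        optimizer.zero_grad()
        loss = -flow(z).mean()                     # maximize exact log-likelihood
        loss.backward()
        optimizer.step()
```

The point of the sketch is the dimensionality: per-sample gradient clipping and noise are applied to a small network over 32-dimensional latents rather than to a pixel-space flow, which is what makes DP-SGD training tractable in the setting the abstract describes.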