Gaussian Differential Privacy Transformation: from identification to application

Published: 28 Jan 2022, Last Modified: 13 Feb 2023, ICLR 2022 Submission
Keywords: differential privacy, gaussian differential privacy, privacy profile
Abstract: Gaussian differential privacy (GDP) is a single-parameter family of privacy notions that provides coherent guarantees against the exposure of individuals by machine learning models. Relative to traditional $(\epsilon,\delta)$-differential privacy (DP), GDP is more interpretable and tightens the bounds given by standard DP composition theorems. In this paper, we start with an exact privacy profile characterization of $(\epsilon,\delta)$-DP and then define an efficient, tractable, and visualizable tool, called the Gaussian differential privacy transformation (GDPT). Using the theoretical properties of the GDPT, we develop an easy-to-verify criterion for characterizing and identifying GDP algorithms. By our criterion, an algorithm is GDP if and only if an asymptotic condition on its privacy profile is met. By developing the numerical properties of the GDPT, we give a method to narrow down the possible values of an optimal privacy measurement $\mu$ with an arbitrarily small and quantifiable margin of error. As applications of our newly developed tools, we revisit some established $(\epsilon,\delta)$-DP algorithms and find that their utility can be improved. We additionally compare two single-parameter families of privacy notions, $\epsilon$-DP and $\mu$-GDP. Lastly, we use the GDPT to examine the effect of subsampling under the GDP framework.
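The abstract's privacy profile concept (the curve $\delta(\epsilon)$ traced out by a mechanism) can be made concrete using the well-known duality between $\mu$-GDP and $(\epsilon,\delta)$-DP from the GDP literature: a $\mu$-GDP mechanism is $(\epsilon,\delta(\epsilon))$-DP with $\delta(\epsilon)=\Phi(\mu/2-\epsilon/\mu)-e^{\epsilon}\,\Phi(-\mu/2-\epsilon/\mu)$, where $\Phi$ is the standard normal CDF. The sketch below is an illustration of that known conversion formula, not code from this paper; the function names are ours.

```python
import math


def std_normal_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))


def gdp_privacy_profile(eps: float, mu: float) -> float:
    """delta(eps) for a mu-GDP mechanism.

    Uses the standard mu-GDP -> (eps, delta)-DP conversion:
        delta = Phi(mu/2 - eps/mu) - exp(eps) * Phi(-mu/2 - eps/mu)
    """
    return (std_normal_cdf(mu / 2.0 - eps / mu)
            - math.exp(eps) * std_normal_cdf(-mu / 2.0 - eps / mu))


# The profile is decreasing in eps: larger eps budgets need less delta slack.
print(gdp_privacy_profile(0.0, 1.0))  # about 0.3829 for mu = 1
print(gdp_privacy_profile(1.0, 1.0))  # smaller delta at eps = 1
```

Plotting `gdp_privacy_profile` against the empirically measured profile of an algorithm is one way to visualize whether the algorithm's guarantee is dominated by some $\mu$-GDP curve.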