Log-Concave and Multivariate Canonical Noise Distributions for Differential Privacy

Published: 31 Oct 2022, Last Modified: 19 Dec 2022, NeurIPS 2022 Accept
Keywords: Gaussian differential privacy, tradeoff function, composition, group privacy
TL;DR: We consider the existence and construction of additive noise distributions that fully use the privacy budget under different DP frameworks.
Abstract: A canonical noise distribution (CND) is an additive mechanism designed to satisfy $f$-differential privacy ($f$-DP) without any wasted privacy budget. $f$-DP is a hypothesis testing-based formulation of privacy phrased in terms of tradeoff functions, which capture the difficulty of a hypothesis test. In this paper, we consider the existence and construction of both log-concave CNDs and multivariate CNDs. Log-concave distributions are important to ensure that higher outputs of the mechanism correspond to higher input values, whereas multivariate noise distributions are important to ensure that a joint release of multiple outputs has a tight privacy characterization. We show that the existence and construction of CNDs for both types of problems are related to whether the tradeoff function can be decomposed by functional composition (related to group privacy) or mechanism composition. In particular, we show that pure $\epsilon$-DP cannot be decomposed in either way and that there is neither a log-concave CND nor any multivariate CND for $\epsilon$-DP. On the other hand, we show that Gaussian-DP, $(0,\delta)$-DP, and Laplace-DP each have both log-concave and multivariate CNDs.
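To make the hypothesis-testing framing concrete, the following is a minimal sketch (not code from the paper, and assuming NumPy/SciPy) of the standard Gaussian-DP tradeoff function $G_\mu(\alpha) = \Phi(\Phi^{-1}(1-\alpha) - \mu)$ and of an additive Gaussian mechanism with scale $\Delta/\mu$, which attains $\mu$-Gaussian-DP; the function names `gaussian_tradeoff` and `gaussian_mechanism` are illustrative, not from the paper.

```python
import numpy as np
from scipy.stats import norm

def gaussian_tradeoff(alpha, mu):
    """Gaussian-DP tradeoff function G_mu: the type II error of the optimal
    test between N(0,1) and N(mu,1) at type I error level alpha."""
    return norm.cdf(norm.ppf(1.0 - alpha) - mu)

def gaussian_mechanism(value, mu, sensitivity=1.0, rng=None):
    """Additive Gaussian noise with scale sensitivity/mu, which satisfies
    mu-Gaussian-DP for a statistic with the given sensitivity."""
    if rng is None:
        rng = np.random.default_rng()
    return value + rng.normal(0.0, sensitivity / mu)

# Evaluate the tradeoff curve at a few type I error levels and release one noisy value.
alphas = np.linspace(0.0, 1.0, 5)
print(gaussian_tradeoff(alphas, mu=1.0))
print(gaussian_mechanism(3.0, mu=1.0))
```

The sketch is only meant to illustrate what a tradeoff function looks like and what an additive mechanism that exactly matches it does; the paper's contribution concerns when such exactly-matching (log-concave or multivariate) noise distributions exist for a given tradeoff function.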
Supplementary Material: pdf