Dirichlet Mechanism for Differentially Private KL Divergence Minimization

Published: 21 Feb 2023, Last Modified: 28 Feb 2023. Accepted by TMLR.
Abstract: Given an empirical distribution $f(x)$ of sensitive data $x$, we consider the task of minimizing $F(y) = D_{\text{KL}} (f(x)\Vert y)$ over the probability simplex, while protecting the privacy of $x$. We observe that, if we take the exponential mechanism and use the KL divergence as the loss function, then the resulting algorithm is the Dirichlet mechanism, which outputs a single draw from a Dirichlet distribution. Motivated by this, we propose a Rényi differentially private (RDP) algorithm that employs the Dirichlet mechanism to solve the KL divergence minimization task. In addition, given $f(x)$ as above and an output $\hat{y}$ of the Dirichlet mechanism, we prove a probability tail bound on $D_{\text{KL}} (f(x)\Vert \hat{y})$, which is then used to derive a lower bound on the sample complexity of our RDP algorithm. Experiments on real-world datasets demonstrate the advantages of our algorithm over the Gaussian and Laplace mechanisms in supervised classification and maximum likelihood estimation.
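The abstract's key observation is that the exponential mechanism with loss $D_{\text{KL}}(f \Vert y)$ has density proportional to $\exp\big(k \sum_i f_i \log y_i\big) \propto \prod_i y_i^{k f_i}$ over the simplex, i.e. a Dirichlet density with parameters $\alpha_i = k f_i + 1$. The following is a minimal sketch of that mechanism under this assumed parameterization; the scale `k` is a hypothetical concentration/privacy parameter here, and calibrating it to a concrete RDP guarantee is what the paper works out. See the linked repository for the authors' actual implementation.

```python
import numpy as np

def dirichlet_mechanism(counts, k, rng=None):
    """Privately release a distribution close in KL to the empirical one.

    Sketch only: draws one sample from Dirichlet(k * f + 1), where f is
    the empirical distribution of `counts`. This matches the exponential
    mechanism with loss D_KL(f || y), whose density on the simplex is
    proportional to prod_i y_i^{k * f_i}, i.e. Dirichlet with
    alpha_i = k * f_i + 1. Mapping k to an RDP guarantee is assumed
    to follow the paper's analysis and is not done here.
    """
    rng = np.random.default_rng() if rng is None else rng
    f = np.asarray(counts, dtype=float)
    f = f / f.sum()              # empirical distribution f(x)
    alpha = k * f + 1.0          # Dirichlet concentration parameters
    return rng.dirichlet(alpha)  # single draw hat{y} from the mechanism

# Example: privately estimate a categorical distribution.
counts = [40, 35, 15, 10]        # histogram of the sensitive data x
y_hat = dirichlet_mechanism(counts, k=100.0)
print(y_hat, y_hat.sum())        # a point on the probability simplex
```

Larger `k` concentrates the Dirichlet draw around $f(x)$ (better utility, weaker privacy), while smaller `k` flattens it toward the uniform distribution.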
Submission Length: Regular submission (no more than 12 pages of main content)
Code: https://github.com/donlapark/Dirichlet-Mechanism
Assigned Action Editor: ~Gautam_Kamath1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 662