Strong Data Processing Inequalities for Locally Differentially Private Mechanisms

Published: 25 Jun 2023 · Last Modified: 06 Feb 2024 · 2023 IEEE International Symposium on Information Theory (ISIT) · CC BY 4.0
Abstract:

We investigate the strong data processing inequalities of locally differentially private mechanisms under a specific $f$-divergence, namely the $\mathsf{E}_\gamma$-divergence. More specifically, we characterize an upper bound on the $\mathsf{E}_\gamma$-divergence between $P\mathsf{K}$ and $Q\mathsf{K}$, the output distributions of an $\varepsilon$-LDP mechanism $\mathsf{K}$, in terms of the $\mathsf{E}_\gamma$-divergence between the corresponding input distributions $P$ and $Q$. Interestingly, the tightest such upper bound in the binary case turns out to have a non-multiplicative form. We then extend our results to derive a tight upper bound for general $f$-divergences. As an application of our main findings, we derive a lower bound on the locally private Bayesian estimation risk that is tighter than the available divergence-based bound in the literature.
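For readers unfamiliar with the notation, the following standard definitions (not stated in the abstract itself) may help. The $\mathsf{E}_\gamma$-divergence, also known as the hockey-stick divergence, is defined for $\gamma \ge 1$ by
$$
\mathsf{E}_\gamma(P \,\|\, Q) \;=\; \sup_{A} \big[\, P(A) - \gamma\, Q(A) \,\big],
$$
where the supremum is over measurable events $A$, and a mechanism $\mathsf{K}$ satisfies $\varepsilon$-LDP if
$$
\mathsf{K}(S \mid x) \;\le\; e^{\varepsilon}\, \mathsf{K}(S \mid x') \quad \text{for all inputs } x, x' \text{ and measurable sets } S.
$$
Here $P\mathsf{K}$ denotes the output distribution obtained by passing a sample from $P$ through the channel $\mathsf{K}$.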
