Differential Privacy with Manifold Data Dependency

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission
Keywords: Differential privacy, data correlation
Abstract: In this paper, we study dataset processing mechanisms generated by linear queries in the presence of manifold data dependency. Specifically, the input data are assumed to lie in an affine manifold that is known to adversaries as prior knowledge. First, we show that such manifold data dependency can have a significant impact on privacy levels compared to the case without the manifold constraint. We then establish necessary and sufficient conditions for achieving differential privacy via structured noise injection mechanisms, in which non-i.i.d. Gaussian or Laplace noise is calibrated into the dataset. Next, in light of these conditions, we develop procedures by which a prescribed privacy budget can be tightly met with a matching noise level. Finally, we show that the framework has immediate applications in differentially private cloud-based control, where manifold data dependency arises naturally from the system dynamics, and the proposed theory and procedures become effective tools for evaluating privacy levels and designing provably useful algorithms.
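To make the structured-noise idea concrete, here is a minimal illustrative sketch (not the paper's actual mechanism or calibration): a standard Gaussian mechanism whose noise is projected onto the null space of a constraint matrix `A`, so the released data still lie on the affine manifold {x : Ax = b} that the adversary already knows. The function name `manifold_gaussian_mechanism`, the use of the identity query, and the assumed L2 sensitivity value are all hypothetical choices for illustration.

```python
import numpy as np

def manifold_gaussian_mechanism(x, A, eps, delta, sensitivity, seed=None):
    """Add Gaussian noise restricted to the affine manifold {x : A x = b}.

    x           : data vector assumed to lie on the manifold (A @ x == b)
    A           : constraint matrix defining the affine manifold
    sensitivity : assumed L2 sensitivity of the (identity) query along the manifold
    """
    rng = np.random.default_rng(seed)
    # Standard Gaussian-mechanism calibration for (eps, delta)-DP.
    sigma = np.sqrt(2.0 * np.log(1.25 / delta)) * sensitivity / eps
    # Orthogonal projector onto the null space of A: P = I - A^+ A.
    # Noise in this subspace does not change A @ x, so the constraint is kept.
    P = np.eye(A.shape[1]) - np.linalg.pinv(A) @ A
    noise = rng.normal(0.0, sigma, size=x.shape)
    return x + P @ noise

# Usage: a point on the manifold {x : sum(x) = 1} stays on it after noising.
A = np.array([[1.0, 1.0, 1.0]])
x = np.array([0.2, 0.3, 0.5])
x_priv = manifold_gaussian_mechanism(x, A, eps=1.0, delta=1e-5,
                                     sensitivity=1.0, seed=0)
assert np.allclose(A @ x_priv, A @ x)  # constraint A x = b preserved
```

The design point this illustrates is the one the abstract raises: because the adversary knows the manifold, noise components orthogonal to it provide no protection and can even be filtered out, so the effective noise (and the resulting privacy level) should be analyzed within the manifold itself.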
One-sentence Summary: A framework that exploits the manifold dependency in input data to produce tighter differential privacy guarantees.
Supplementary Material: zip