Relaxed Marginal Consistency for Differentially Private Query Answering

May 21, 2021 (edited Oct 25, 2021) · NeurIPS 2021 Poster · Readers: Everyone
  • Keywords: differential privacy, convex optimization, graphical models, approximate inference, local polytope
  • TL;DR: We propose a post-processing technique that boosts utility by enforcing (local) consistency constraints. Our method scales to far more general settings than prior work.
  • Abstract: Many differentially private algorithms for answering database queries involve a step that reconstructs a discrete data distribution from noisy measurements. This provides consistent query answers and reduces error, but often requires space that grows exponentially with dimension. PRIVATE-PGM is a recent approach that uses graphical models to represent the data distribution, with complexity proportional to that of exact marginal inference in a graphical model with structure determined by the co-occurrence of variables in the noisy measurements. PRIVATE-PGM is highly scalable for sparse measurements, but may fail to run in high dimensions with dense measurements. We overcome the main scalability limitation of PRIVATE-PGM through a principled approach that relaxes consistency constraints in the estimation objective. Our new approach works with many existing private query answering algorithms and improves scalability or accuracy with no privacy cost.
  • Supplementary Material: pdf
  • Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
  • Code: https://github.com/ryan112358/private-pgm/tree/approx-experiments-snapshot
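The workflow the abstract describes (measure noisy marginals under differential privacy, then reconstruct a distribution that can answer further queries) can be sketched roughly as follows. This is a minimal illustration modeled on the style of the linked repository's README, not the paper's exact method: the mbi names used here (Dataset, FactoredInference, the (Q, y, sigma, clique) measurement tuples) and their signatures are assumptions based on the public README and may differ in the approx-experiments-snapshot branch.

```python
# Hedged sketch of the measure-then-estimate workflow described in the abstract.
# Class names and signatures are assumptions modeled on the private-pgm README
# and may differ in the linked branch.
import numpy as np
from scipy import sparse
from mbi import Dataset, FactoredInference

# Load a discrete dataset together with its domain description (file names are
# illustrative placeholders).
data = Dataset.load('adult.csv', 'adult-domain.json')
domain = data.domain
total = data.df.shape[0]  # assumed: Dataset wraps a pandas DataFrame as .df

# Pick some low-dimensional marginals (cliques) to measure privately.
cliques = [('age', 'education-num'), ('marital-status', 'sex')]

# Add Gaussian noise to each marginal; sigma would be set by the privacy budget.
sigma = 50.0
measurements = []
for cl in cliques:
    x = data.project(cl).datavector()                            # true marginal counts
    y = x + np.random.normal(loc=0, scale=sigma, size=x.size)    # noisy counts
    Q = sparse.eye(x.size)                                       # identity query matrix
    measurements.append((Q, y, sigma, cl))

# Post-process: fit a graphical model that is consistent with the noisy
# measurements (this step consumes no additional privacy budget).
engine = FactoredInference(domain, iters=1000)
model = engine.estimate(measurements, total=total)

# Answer an unmeasured query from the estimated model.
answer = model.project(('sex', 'education-num')).datavector()
```

The relaxed-consistency estimator proposed in the paper would slot in at the estimation step in place of the exact graphical-model engine; the corresponding entry point lives in the linked branch.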