Privately Publishable Per-instance Privacy

21 May 2021, 20:43 (edited 28 Jan 2022) · NeurIPS 2021 Poster
  • Keywords: differential privacy, private ERM, per-instance privacy, objective perturbation
  • TL;DR: We calculate the per-instance privacy loss of releasing a private empirical risk minimizer via objective perturbation, and propose methods to privately and accurately publish the per-instance privacy losses at little to no additional privacy cost.
  • Abstract: We consider how to privately share the personalized privacy losses incurred by objective perturbation, using per-instance differential privacy (pDP). Standard differential privacy (DP) gives us a worst-case bound that might be orders of magnitude larger than the privacy loss to a particular individual relative to a fixed dataset. The pDP framework provides a more fine-grained analysis of the privacy guarantee to a target individual, but the per-instance privacy loss itself might be a function of sensitive data. In this paper, we analyze the per-instance privacy loss of releasing a private empirical risk minimizer learned via objective perturbation, and propose a set of methods to privately and accurately publish the pDP losses at little to no additional privacy cost.
  • Supplementary Material: pdf
  • Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
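As a rough illustration of the mechanism the abstract refers to, the sketch below shows objective perturbation for regularized logistic regression: a random linear term is added to the training objective once, and the perturbed objective is then minimized. The function name, the Gaussian noise, its scale, and the optimizer settings are illustrative assumptions here; the paper's actual mechanism calibrates the noise to the loss's smoothness and the target privacy parameters, and its pDP analysis is not reproduced in this sketch.

```python
import numpy as np

def objective_perturbation_lr(X, y, lam=0.1, noise_scale=1.0,
                              steps=500, lr=0.5, seed=0):
    """Sketch of objective perturbation for logistic regression.

    Minimizes  (1/n) * sum_i log(1 + exp(-y_i <w, x_i>))
             + (lam/2) * ||w||^2  +  <b, w> / n,
    where b is a noise vector drawn once up front. The Gaussian
    noise and its scale are assumptions for illustration only; a
    DP guarantee requires calibrating b to (eps, delta).
    """
    n, d = X.shape
    rng = np.random.default_rng(seed)
    b = rng.normal(scale=noise_scale, size=d)  # one-shot objective noise
    w = np.zeros(d)
    for _ in range(steps):
        z = (X @ w) * y                    # margins y_i <w, x_i>
        sig = 1.0 / (1.0 + np.exp(z))      # sigmoid(-z)
        # gradient of perturbed objective
        grad = -(X * (sig * y)[:, None]).mean(axis=0) + lam * w + b / n
        w -= lr * grad
    return w

# toy usage on a linearly separable dataset with labels in {-1, +1}
X = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w_priv = objective_perturbation_lr(X, y)
```

Because the noise enters the objective rather than the output, the released minimizer `w_priv` is an exact minimizer of a perturbed problem, which is what makes a per-instance analysis of its privacy loss tractable.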