Patio: Framework for Private Release of Ratios

22 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Desk Rejected Submission
Keywords: privacy, ratio, average
TL;DR: A new algorithm with improved utility for privately computing ratios
Abstract: Averages and ratios are some of the most basic primitives in data analytics, statistics, and machine learning. In this work, we study the differentially private (DP) release of ratios. For tasks in which the numerator $a(\cdot)$ and denominator $b(\cdot)$ satisfy a certain general co-monotonicity property, we give a new mechanism, \emph{Patio} (Private rATIO), for privately releasing the ratio $a(\mathbf{x})/b(\mathbf{x})$ for an input dataset $\mathbf{x}$, with strong theoretical guarantees and practical performance. We also prove that, under general conditions on $a(\cdot)$ and $b(\cdot)$, the variance of our mechanism matches, up to a $1+o(1)$ factor, the variance of the Laplace distribution scaled with the \emph{local} sensitivity. This is in contrast with the standard Laplace mechanism, which scales the noise with the potentially much larger \emph{global} sensitivity. Our algorithm can be applied to a variety of tasks and settings, including estimating averages, the Jaccard similarity coefficient, and several metrics quantifying the utility of a classifier, such as its precision, sensitivity, specificity, and $F$-score. For the above statistics, our MSE matches that of the Laplace distribution scaled to the local sensitivity of the given task. We perform an empirical evaluation showing the improved utility of our algorithm compared to natural and state-of-the-art baselines.
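For context on the global-sensitivity baseline the abstract contrasts against, below is a minimal sketch of one natural Laplace-mechanism baseline for releasing a ratio $a(\mathbf{x})/b(\mathbf{x})$: perturb the numerator and denominator separately with Laplace noise calibrated to their global sensitivities, split the privacy budget between the two queries, and return the noisy ratio. The function name, arguments, and even budget split are assumptions for illustration; this is not the paper's Patio mechanism, which is not described on this page.

```python
import numpy as np

def laplace_ratio_baseline(a_value, b_value, a_global_sens, b_global_sens,
                           epsilon, rng=None):
    """Natural baseline for epsilon-DP release of a ratio a(x)/b(x):
    add Laplace noise calibrated to the GLOBAL sensitivities of the
    numerator and denominator separately, splitting epsilon evenly.

    Illustrative sketch only -- names, arguments, and the even budget
    split are assumptions; this is NOT the Patio mechanism.
    """
    rng = np.random.default_rng() if rng is None else rng
    eps_half = epsilon / 2.0  # half the budget per query (basic composition)
    noisy_a = a_value + rng.laplace(scale=a_global_sens / eps_half)
    noisy_b = b_value + rng.laplace(scale=b_global_sens / eps_half)
    # Guard against a non-positive noisy denominator in this toy sketch.
    noisy_b = max(noisy_b, 1e-9)
    return noisy_a / noisy_b

# Toy example: an average as sum/count over values clipped to [0, 1], so
# both the sum and the count have global sensitivity 1 under add/remove
# of a single record.
values = np.clip(np.array([0.2, 0.9, 0.4, 0.7, 1.0]), 0.0, 1.0)
print(laplace_ratio_baseline(values.sum(), len(values),
                             a_global_sens=1.0, b_global_sens=1.0,
                             epsilon=1.0))
```

Because the noise scale here is tied to the worst-case (global) sensitivity, the error can be much larger than what local sensitivity would allow on a typical dataset, which is the gap the abstract claims Patio closes.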
Primary Area: societal considerations including fairness, safety, privacy
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5794