Fast Estimation for Privacy and Utility in Differentially Private Machine Learning

28 Sept 2020 (modified: 20 Nov 2023) · ICLR 2021 Conference Withdrawn Submission
Keywords: machine learning, privacy, parameter selection
Abstract: Recently, differential privacy has been widely studied in machine learning due to its formal privacy guarantees for data analysis. As one of the most important parameters of differential privacy, $\epsilon$ controls the crucial tradeoff between the strength of the privacy guarantee and the utility of the model. Therefore, the choice of $\epsilon$ has a great influence on the performance of differentially private learning models. So far, however, there is still no rigorous method for choosing $\epsilon$. In this paper, we derive the influence of $\epsilon$ on the utility of private learning models through strict mathematical analysis, and propose a novel approximate approach for estimating the utility of any $\epsilon$ value. We show that our approximate approach has a fairly small error and can be used to estimate the optimal $\epsilon$ according to the expected utility of users. Experimental results demonstrate the high estimation accuracy and broad applicability of our approximate approach.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
One-sentence Summary: An approximate approach for efficiently estimating the privacy and utility of differentially private machine learning algorithms.
Supplementary Material: zip
Reviewed Version (pdf): https://openreview.net/references/pdf?id=Q9dBOZUTNv