Keywords: Differential Privacy, Privacy-Preserving Machine Learning, Means, Quantiles
Abstract: Differential privacy is widely adopted to provide provable privacy guarantees in data analysis. We consider the problem of combining public and private data (and, more generally, data with heterogeneous privacy needs) for estimating aggregate statistics. We introduce a mixed mean estimator optimized to minimize the variance. We argue that our mechanism is preferable to techniques that preserve the privacy of individuals by subsampling data proportionally to the privacy needs of users. Similarly, we present a mixed median estimator based on the exponential mechanism. We compare our mechanisms to the methods proposed by Jorgensen et al. Our experiments provide empirical evidence that our mechanisms often outperform the baseline methods.
Paper Under Submission: The paper is NOT under submission at NeurIPS
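The abstract's mixed mean estimator can be illustrated with a minimal sketch. The paper's exact construction is not given here, so the following is an assumption-laden illustration: it combines a public-sample mean with a Laplace-mechanism private mean using inverse-variance weights, which minimizes the variance of the convex combination. All function names (`private_mean`, `mixed_mean`) and the assumed data variance `sigma2` are hypothetical, introduced only for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def private_mean(x, epsilon, lo, hi):
    """Laplace-mechanism estimate of the mean of values clipped to [lo, hi]."""
    x = np.clip(x, lo, hi)
    sensitivity = (hi - lo) / len(x)  # sensitivity of the clipped sample mean
    return x.mean() + rng.laplace(scale=sensitivity / epsilon)

def mixed_mean(public, private, epsilon, lo, hi, sigma2):
    """Convex combination of a public mean and a private (DP) mean.

    Weights are proportional to inverse variances, which minimizes the
    variance of the combined estimate (assumed data variance sigma2).
    """
    n_pub, n_priv = len(public), len(private)
    var_pub = sigma2 / n_pub
    # Private estimate's variance = sampling variance + Laplace noise variance.
    noise_var = 2 * ((hi - lo) / (n_priv * epsilon)) ** 2
    var_priv = sigma2 / n_priv + noise_var
    w = var_priv / (var_pub + var_priv)  # optimal weight on the public mean
    return w * public.mean() + (1 - w) * private_mean(private, epsilon, lo, hi)
```

For instance, with 10,000 public and 10,000 private draws from a distribution with mean 0.5 and variance 0.01, `mixed_mean(public, private, 1.0, 0.0, 1.0, 0.01)` should land close to 0.5, with the private half contributing nearly as much weight as the public half because the Laplace noise variance is small at this sample size.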