Federated Experiment Design under Distributed Differential Privacy

Published: 19 Jun 2023, Last Modified: 21 Jul 2023, FL-ICML 2023
Keywords: Differential Privacy, Causal Inference, Treatment Effect, Secure Aggregation, Communication
Abstract: Experiment design has a rich history dating back to the early 1920s and has since found numerous critical applications across various fields. However, experiments often collect and use sensitive personal information, so additional measures are required to protect individual privacy during data collection, storage, and usage. In this work, we focus on rigorously protecting users' privacy (under the notion of differential privacy (DP)) while minimizing the trust placed in the service provider. Specifically, we consider estimating the average treatment effect (ATE) within Neyman's potential outcome framework, under DP and secure aggregation, a distributed protocol that enables a service provider to aggregate information without accessing individual data. To achieve DP, we design local privatization mechanisms that are compatible with secure aggregation. We show that when introducing DP noise, it is imperative to 1) judiciously split the privacy budget between estimating the mean and the variance of the outcomes and 2) carefully calibrate the confidence intervals to account for the DP noise. Finally, we present comprehensive experimental evaluations of our proposed schemes and characterize the privacy-utility trade-offs in experiment design.
Submission Number: 89
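To make the two ingredients highlighted in the abstract concrete, the sketch below estimates a DP ATE by (1) splitting a total privacy budget between the mean and variance estimates of each arm and (2) widening the confidence interval to account for the injected DP noise. This is an illustrative simplification, not the paper's protocol: the paper uses local privatization mechanisms with discrete noise compatible with secure aggregation, whereas here a single trusted aggregator adds continuous Gaussian noise, and the budget-split fraction `mean_frac` and the classical (non-tight) Gaussian-mechanism calibration are assumptions for the sketch.

```python
import numpy as np
from statistics import NormalDist

def dp_ate_with_ci(treated, control, eps_total, delta=1e-5, clip=1.0,
                   mean_frac=0.7, alpha=0.05, seed=None):
    """Illustrative DP ATE estimate with a noise-aware confidence interval.

    Assumptions (not from the paper): outcomes are clipped to [-clip, clip],
    a continuous Gaussian mechanism stands in for the paper's discrete,
    secure-aggregation-compatible noise, and `mean_frac` of the budget
    goes to the mean estimate with the rest to the second moment.
    """
    rng = np.random.default_rng(seed)
    eps_mean = mean_frac * eps_total        # budget for the per-arm means
    eps_var = (1 - mean_frac) * eps_total   # budget for the second moments

    def gauss_sigma(sens, eps):
        # Classical Gaussian-mechanism calibration (sufficient, not tight).
        return sens * np.sqrt(2.0 * np.log(1.25 / delta)) / eps

    stats = {}
    for name, x in (("treated", np.asarray(treated)),
                    ("control", np.asarray(control))):
        x = np.clip(x, -clip, clip)
        n = len(x)
        # Privatize the sum (replacing one user moves it by at most 2*clip).
        s_mean = gauss_sigma(2.0 * clip, eps_mean)
        mean_hat = (x.sum() + rng.normal(0.0, s_mean)) / n
        # Privatize the sum of squares (sensitivity clip**2).
        s_var = gauss_sigma(clip ** 2, eps_var)
        sq_hat = (np.sum(x ** 2) + rng.normal(0.0, s_var)) / n
        var_hat = max(sq_hat - mean_hat ** 2, 0.0)
        # Variance of the mean estimate = sampling term + DP-noise term;
        # the extra (s_mean/n)**2 term is what widens the interval.
        se2 = var_hat / n + (s_mean / n) ** 2
        stats[name] = (mean_hat, se2)

    ate = stats["treated"][0] - stats["control"][0]
    se = np.sqrt(stats["treated"][1] + stats["control"][1])
    z = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    return ate, (ate - z * se, ate + z * se)
```

With a generous budget the DP-noise term is negligible and the interval approaches the non-private one; as `eps_total` shrinks, the `(s_mean/n)**2` term dominates and the interval widens, which is the privacy-utility trade-off the paper evaluates.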