Differential Privacy via Group Shuffling

17 Sept 2021, 04:00 (modified: 17 Sept 2021, 05:14) · PRIML 2021 Oral · Readers: Everyone
Keywords: differential privacy, shuffle privacy, models of privacy, binary summation, local privacy
TL;DR: We introduce and initiate the study of the group shuffle model of differential privacy, which interpolates between local and shuffle privacy.
Abstract: The past decade has seen data privacy emerge as a fundamental and pressing issue. Among the tools developed to tackle it, differential privacy has emerged as a central and principled framework, with specific variants capturing various threat models. In particular, the recently proposed shuffle model of differential privacy allows for promising tradeoffs between accuracy and privacy. However, the shuffle model may not be suitable in all situations, as it relies on a distributed setting where all users can coordinate and trust (or simulate) a joint shuffling algorithm. To address this, we introduce a new model, the group shuffle model, in which users are partitioned into several groups, each group having its own local shuffler. We investigate the privacy/accuracy tradeoffs in our model by comparing it to both the shuffle and local models of privacy, between which it in some sense interpolates. In addition to general relations between group shuffle, shuffle, and local privacy, we provide a detailed comparison of the costs and benefits of the group shuffle model, establishing both upper and lower bounds for the specific task of binary summation.
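To make the setting concrete, here is a minimal sketch of how binary summation might look in a group shuffle architecture. This is an illustrative toy protocol, not the paper's actual mechanism: the partition scheme, the use of randomized response with flip probability `p`, and the debiasing step are all assumptions for illustration; the paper's bounds concern what privacy/accuracy tradeoffs such protocols can achieve.

```python
import random

def local_randomizer(bit, p, rng):
    # Randomized response (an assumed local randomizer):
    # flip the user's bit with probability p.
    return bit if rng.random() >= p else 1 - bit

def group_shuffle_sum(bits, num_groups, p, seed=0):
    """Toy binary-summation protocol in a group shuffle setting:
    users are partitioned into groups, each group's shuffler permutes
    only that group's randomized messages (hiding which group member
    sent which message), and a central analyzer sums all received
    messages and debiases the randomized-response noise."""
    rng = random.Random(seed)
    # Hypothetical round-robin partition of users into groups.
    groups = [bits[i::num_groups] for i in range(num_groups)]
    messages = []
    for group in groups:
        msgs = [local_randomizer(b, p, rng) for b in group]
        rng.shuffle(msgs)  # per-group local shuffler
        messages.extend(msgs)
    n = len(bits)
    # E[message] = bit*(1-p) + (1-bit)*p, so debias the total:
    return (sum(messages) - n * p) / (1 - 2 * p)
```

With `p = 0` the protocol is non-private and exact; increasing `p` strengthens each user's local guarantee while adding noise to the estimate, and the shuffling within each group provides the additional amplification that distinguishes this model from the purely local one.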
Paper Under Submission: The paper is NOT under submission at NeurIPS