Abstract: Learning a privacy-preserving model from sensitive data distributed across multiple devices is an increasingly important problem. The problem is often formulated in the federated learning context, with the aim of learning a single global model while keeping the data distributed. Moreover, Bayesian learning is a popular approach for modelling, since it naturally supports reliable uncertainty estimates. However, Bayesian learning is generally intractable even with centralised non-private data, so approximation techniques such as variational inference are a necessity. Variational inference has recently been extended to the non-private federated learning setting via the partitioned variational inference algorithm. For privacy protection, the current gold standard is differential privacy, which guarantees privacy in a strong, mathematically well-defined sense.
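For context, differential privacy here refers to the standard (ε, δ)-formulation; the following restatement is ours, not quoted from the paper. A randomised mechanism M is (ε, δ)-differentially private if, for all neighbouring datasets D and D′ and all measurable output sets S,

```latex
% Standard (epsilon, delta)-differential privacy guarantee;
% generic notation, not necessarily the paper's.
\Pr[\mathcal{M}(D) \in S] \;\le\; e^{\varepsilon} \, \Pr[\mathcal{M}(D') \in S] + \delta .
```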
In this paper, we present differentially private partitioned variational inference, the first general framework for learning a variational approximation to a Bayesian posterior distribution in the federated learning setting while minimising the number of communication rounds and providing differential privacy guarantees for data subjects.
We propose three alternative implementations within the general framework: one based on perturbing the local optimisation runs done by individual parties, and two based on perturbing updates to the global model (one using a version of federated averaging, the other adding virtual parties to the protocol). We compare their properties both theoretically and empirically. We show that perturbing the local optimisation works well with both simple and complex models as long as each party has enough local data; however, privacy is then guaranteed by each party independently. In contrast, perturbing the global updates works best with relatively simple models, but given access to suitable secure primitives, such as secure aggregation or secure shuffling, its performance can be improved by having all parties guarantee privacy jointly.
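As a concrete illustration of the "perturbing updates to the global model" variant, here is a minimal sketch of a Gaussian-mechanism clip-and-perturb step applied to each party's update before averaging. This is our own hedged example, not the implementation from the linked repository; all names (`privatise_update`, `clip_norm`, `noise_multiplier`) and parameter values are hypothetical.

```python
# Minimal sketch (not the paper's implementation) of privatising one
# party's update to the global variational parameters via the Gaussian
# mechanism: clip the update's L2 norm, then add calibrated noise.
import numpy as np


def privatise_update(delta: np.ndarray,
                     clip_norm: float,
                     noise_multiplier: float,
                     rng: np.random.Generator) -> np.ndarray:
    """Clip and perturb a single party's parameter update (hypothetical helper)."""
    # Bound the sensitivity of the update by clipping its L2 norm to clip_norm.
    norm = np.linalg.norm(delta)
    clipped = delta * min(1.0, clip_norm / max(norm, 1e-12))
    # Add isotropic Gaussian noise whose scale is tied to the clipping bound.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=delta.shape)
    return clipped + noise


# Example: a server averaging noisy updates from several parties.
rng = np.random.default_rng(0)
updates = [rng.normal(size=10) for _ in range(5)]  # placeholder party updates
noisy = [privatise_update(u, clip_norm=1.0, noise_multiplier=1.1, rng=rng)
         for u in updates]
global_step = np.mean(noisy, axis=0)               # averaged global update
```

In practice, `noise_multiplier` would be calibrated to a target (ε, δ) with a privacy accountant, and with secure aggregation the noise could be added jointly so that each party contributes only a fraction of it.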
Submission Length: Long submission (more than 12 pages of main content)
Changes Since Last Submission: - Added previously omitted definitions: exponential family (Section 3.1); Gaussian mechanism and subsampling (Section 3.2).
- Added further clarifications and details (e.g., that the data are typically vectors of reals, in Section 3.1).
- Added an additional plot (Figure 5) to the Appendix, showing the effect of the number of local data partitions with local averaging.
- Added fixes (including more explicit references to several theorems) to improve readability and make the relevant sections more self-contained.
- Moved the Related Work section to the beginning of the paper.
- Fixed all notational overlaps, inconsistencies, and typos.
Code: https://github.com/DPBayes/DPPVI
Assigned Action Editor: ~Gautam_Kamath1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 450