A Stochastic Gradient Langevin Dynamics Algorithm For Noise Intrinsic Federated Learning

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Withdrawn Submission · Readers: Everyone
Abstract: Non-i.i.d. data distribution and differential privacy (DP) protection are two open problems in federated learning (FL). We address both by proposing the first noise-intrinsic FL training algorithm. In the proposed algorithm, we incorporate a stochastic gradient Langevin dynamics (SGLD) oracle into each local node's parameter update phase. The introduced SGLD oracle lowers the generalization error of local parameter learning and provides DP protection for the local nodes. We analyze the algorithm theoretically by formulating a min-max objective function and connecting its upper bound to the global loss function in FL. The convergence of our algorithm on non-convex functions is also established via the contraction and coupling rates of two random processes defined by stochastic differential equations (SDEs). We will provide a DP analysis of the proposed training algorithm and additional experimental results soon.
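The abstract gives no pseudocode, but a minimal sketch of the SGLD-style local update it describes might look like the following. All names here (`local_sgld_update`, the helper `grad_loss`, the `temperature` knob) are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def local_sgld_update(theta, grad_loss, data_batches, step_size=1e-3, temperature=1.0):
    """One local-node training pass: an SGD step plus injected Gaussian noise (SGLD).

    theta        : current local parameter vector (np.ndarray)
    grad_loss    : callable(theta, batch) -> stochastic gradient (assumed helper)
    data_batches : iterable of mini-batches held privately by this node
    temperature  : scales the Langevin noise; conceptually the knob tied to DP strength
    """
    for batch in data_batches:
        g = grad_loss(theta, batch)
        # Standard SGLD noise term: Gaussian with variance 2 * step_size * temperature.
        noise = np.sqrt(2.0 * step_size * temperature) * np.random.randn(*theta.shape)
        theta = theta - step_size * g + noise
    return theta
```

In a full FL round, each node would run an update of this shape on its private data and return the noisy parameters for server-side averaging; the intrinsically injected Gaussian noise is what the abstract credits with both lower generalization error and local DP protection.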
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Reviewed Version (pdf): https://openreview.net/references/pdf?id=3jUkHBZoR