Why DP "LOCAL" SGD – Faster Convergence in Less Composition with Clipping Bias Reduction

27 Sept 2024 (modified: 12 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: Differential Privacy, Local Stochastic Gradient Descent, Clipping Bias
Abstract: We advocate applying Differentially Private Local Stochastic Gradient Descent (DP-LSGD), a generalization of regular DP-SGD with per-sample local iterations, to systematically improve privacy-preserving machine learning. We prove and show the following in this paper: (a) with local iterations, DP-LSGD produces more concentrated per-sample updates and therefore exploits the clipping budget more efficiently, yielding a better utility-privacy tradeoff; (b) given the same number $T$ of privacy compositions or per-sample update aggregations, with properly selected local iterations, DP-LSGD converges at rate $O(1/T)$ to a small neighborhood of a (local) optimum, compared to $O(1/\sqrt{T})$ for regular DP-SGD; that is, DP-LSGD attains the same accuracy while consuming less of the privacy budget. On the empirical side, we provide thorough experiments supporting the developed theory and show that DP-LSGD achieves the best-known performance on various practical deep learning tasks: for example, under an $(\epsilon=4,\delta=10^{-5})$-DP guarantee, we train ResNet20 from scratch to test accuracies of $74.1\%$, $86.5\%$, and $91.7\%$ on CIFAR10, SVHN, and EMNIST, respectively. Our code is released via an anonymous GitHub link.
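As a rough illustration (not the authors' released implementation), below is a minimal NumPy sketch of one DP-LSGD round under one natural reading of the abstract: each sample runs K local SGD iterations from the current global iterate, the per-sample update (local minus global weights) is clipped to norm C, and the clipped updates are averaged with Gaussian noise calibrated to C. The logistic-regression objective and all names (K, local_lr, clip_C, sigma) are illustrative assumptions; the paper's exact procedure and its privacy accounting over T rounds are not specified in the abstract.

```python
# Hypothetical sketch of one DP-LSGD round on logistic regression.
# Hyperparameter names and the objective are illustrative, not from the paper.
import numpy as np

def per_sample_grad(w, x, y):
    """Gradient of the logistic loss for one sample (label y in {0, 1})."""
    p = 1.0 / (1.0 + np.exp(-x @ w))
    return (p - y) * x

def dp_lsgd_step(w, X, Y, K=5, local_lr=0.1, clip_C=1.0, sigma=1.0,
                 rng=np.random.default_rng(0)):
    """One global DP-LSGD round over a batch (X, Y)."""
    updates = np.zeros_like(w)
    for x, y in zip(X, Y):
        w_local = w.copy()
        for _ in range(K):                 # K local iterations per sample
            w_local -= local_lr * per_sample_grad(w_local, x, y)
        delta = w_local - w                # per-sample update
        # Clip the update to norm clip_C (the "clipping budget").
        delta *= min(1.0, clip_C / (np.linalg.norm(delta) + 1e-12))
        updates += delta
    # Average the clipped updates and add Gaussian noise scaled to clip_C.
    noise = rng.normal(0.0, sigma * clip_C, size=w.shape)
    return w + (updates + noise) / len(X)

# Toy usage: 50 rounds on a linearly separable problem.
rng = np.random.default_rng(1)
X = rng.normal(size=(64, 5))
Y = (X @ np.ones(5) > 0).astype(float)
w = np.zeros(5)
for _ in range(50):
    w = dp_lsgd_step(w, X, Y)
```

With K=1, the per-sample update collapses to a single clipped gradient step, recovering a DP-SGD-style round; this is the sense in which DP-LSGD generalizes regular DP-SGD while, per the abstract, larger K yields more concentrated updates per composition.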
Primary Area: alignment, fairness, safety, privacy, and societal considerations
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 10186