First Provable Guarantees for Practical Private FL: Beyond Restrictive Assumptions

Published: 10 Jun 2025, Last Modified: 01 Jul 2025 · TTODLer-FM @ ICML 2025 Poster · CC BY 4.0
Keywords: Federated Learning, Differential Privacy, Distributed Optimization, Smoothed Normalization
TL;DR: We introduce Fed-α-NormEC, the first practical private FL framework with provable guarantees under partial participation and local updates, free from restrictive assumptions.
Abstract: Federated Learning (FL) enables collaborative training on decentralized data. Differential Privacy (DP) is crucial for FL, but current private methods often rely on unrealistic assumptions (e.g., bounded gradients or bounded heterogeneity), hindering practical application. Existing works that relax these assumptions typically neglect practical FL mainstays such as partial client participation or multiple local updates. We introduce Fed-$\alpha$-NormEC, the first differentially private FL framework providing provable convergence and DP guarantees under standard assumptions while fully supporting these practical elements. Fed-$\alpha$-NormEC integrates local updates (full and incremental gradient steps), separate server and client stepsizes, and, crucially, partial client participation, which is essential for real-world deployment and vital for privacy amplification. Our theoretical guarantees are corroborated by experiments on private deep learning tasks.
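To make the abstract's ingredients concrete, the following is a minimal sketch (not the authors' code) of one communication round in the style the abstract describes: partial client participation via sampling, multiple local gradient steps, separate client and server stepsizes, and a privatized, error-compensated update. It assumes the smoothed-normalization operator takes the common form $\psi_\alpha(x) = x / (\alpha + \|x\|)$ and that privacy comes from the standard Gaussian mechanism; all function and parameter names here are hypothetical illustrations, not the paper's API.

```python
# Hypothetical sketch of a Fed-alpha-NormEC-style round, assuming
# psi_alpha(x) = x / (alpha + ||x||) and Gaussian DP noise.
import numpy as np

rng = np.random.default_rng(0)

def smoothed_normalization(x, alpha):
    """Assumed operator psi_alpha(x) = x / (alpha + ||x||).
    Its output norm is below 1, so no bounded-gradient assumption is needed."""
    return x / (alpha + np.linalg.norm(x))

def local_update(x, grad_fn, client_lr, num_local_steps):
    """Run several local full-gradient steps and return the model delta."""
    y = x.copy()
    for _ in range(num_local_steps):
        y -= client_lr * grad_fn(y)
    return y - x

def fed_round(x, errors, clients, sample_size, alpha, client_lr,
              server_lr, num_local_steps, noise_std):
    """One round with partial participation and error compensation
    (a sketch under the stated assumptions, not the paper's algorithm)."""
    sampled = rng.choice(len(clients), size=sample_size, replace=False)
    aggregate = np.zeros_like(x)
    for i in sampled:
        delta = local_update(x, clients[i], client_lr, num_local_steps)
        # Error-compensated, smoothed-normalized message (assumed form).
        msg = smoothed_normalization(errors[i] + delta, alpha)
        errors[i] = errors[i] + delta - msg              # keep the residual
        msg = msg + noise_std * rng.standard_normal(x.shape)  # Gaussian noise
        aggregate += msg
    # Server applies its own stepsize to the averaged private messages.
    return x + server_lr * aggregate / sample_size

# Toy usage: heterogeneous quadratics f_i(x) = ||x - b_i||^2 / 2,
# whose gradient is x - b_i.
dim, num_clients = 5, 10
targets = [rng.standard_normal(dim) for _ in range(num_clients)]
clients = [(lambda y, b=b: y - b) for b in targets]
x = np.zeros(dim)
errors = [np.zeros(dim) for _ in range(num_clients)]
for _ in range(200):
    x = fed_round(x, errors, clients, sample_size=4, alpha=0.1,
                  client_lr=0.1, server_lr=0.5, num_local_steps=3,
                  noise_std=0.01)
```

Sampling only a subset of clients per round is what enables privacy amplification by subsampling, which is why the abstract calls partial participation vital for the DP guarantee.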
Submission Number: 29