FedANC: Adaptive Sparse Noise Scheduling for Federated Differential Privacy

18 Sept 2025 (modified: 18 Jan 2026) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Federated Learning, Differential Privacy, Adaptive Noise Controller, Sparse Gradient Perturbation
Abstract: Federated Learning (FL) enables multiple clients to collaboratively train a shared model without sharing raw data. Although this reduces direct exposure of local data, model updates can still leak sensitive information through gradient-based attacks. Differential Privacy (DP) mitigates this risk by adding calibrated noise to updates, providing formal guarantees. However, most existing DP-FL methods adopt fixed noise scales and uniform injection across all gradient dimensions, without adapting to client heterogeneity or training dynamics, which often results in poor privacy-utility trade-offs. To overcome these limitations, we propose FedANC, an adaptive differential privacy framework for FL. It consists of three components: (i) an LSTM-based Adaptive Noise Controller (ANC) that generates client-specific noise scales and sparsity ratios from local training feedback; (ii) a Selective Noise Injection mechanism that perturbs only the most sensitive gradient entries; and (iii) a Privacy Budget Regularization term that aligns per-round updates with a predefined privacy target. For stability, the ANC is pretrained on synthetic feedback that simulates typical training behavior. We provide theoretical guarantees on both convergence and differential privacy. Extensive experiments demonstrate that FedANC achieves higher accuracy, faster convergence, and stronger privacy protection than existing approaches.
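To make components (i) and (ii) concrete, below is a minimal sketch, assuming PyTorch. All names (`AdaptiveNoiseController`, `selective_noise_injection`), the choice of feedback features (loss and gradient norm), and the magnitude-based top-k selection rule are illustrative assumptions, not the paper's actual implementation; top-k by magnitude is only one plausible reading of "most sensitive gradient entries".

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveNoiseController(nn.Module):
    """Hypothetical LSTM controller: maps a sequence of per-round training
    feedback (here: loss and gradient norm) to a noise scale sigma and a
    sparsity ratio rho for the next round."""
    def __init__(self, feedback_dim=2, hidden_dim=16):
        super().__init__()
        self.lstm = nn.LSTM(feedback_dim, hidden_dim, batch_first=True)
        self.sigma_head = nn.Linear(hidden_dim, 1)  # noise scale, must be > 0
        self.rho_head = nn.Linear(hidden_dim, 1)    # sparsity ratio in (0, 1)

    def forward(self, feedback):                    # feedback: (B, T, feedback_dim)
        h, _ = self.lstm(feedback)
        last = h[:, -1]                             # hidden state after last round
        sigma = F.softplus(self.sigma_head(last)) + 1e-3  # keep strictly positive
        rho = torch.sigmoid(self.rho_head(last))          # squash into (0, 1)
        return sigma.squeeze(-1), rho.squeeze(-1)

def selective_noise_injection(grad, sigma, rho, clip_norm=1.0):
    """Clip the update in L2 norm, then add Gaussian noise only to the
    rho-fraction of entries with the largest magnitude (a simple proxy
    for the 'most sensitive' coordinates described in the abstract)."""
    grad = grad * torch.clamp(clip_norm / (grad.norm() + 1e-12), max=1.0)
    k = max(1, int(rho * grad.numel()))
    idx = grad.abs().flatten().topk(k).indices      # top-k entries by magnitude
    noisy = grad.flatten().clone()
    noisy[idx] += sigma * clip_norm * torch.randn(k)
    return noisy.view_as(grad)

# Toy usage: the controller consumes two rounds of (loss, grad-norm) feedback.
anc = AdaptiveNoiseController()
feedback = torch.tensor([[[2.3, 5.1], [1.9, 4.2]]])  # shape (B=1, T=2, 2)
sigma, rho = anc(feedback)
g = torch.randn(1000)                                # stand-in client update
g_private = selective_noise_injection(g, sigma.item(), rho.item())
```

The softplus and sigmoid heads simply keep sigma positive and rho inside (0, 1). The sketch omits per-round privacy accounting and the Privacy Budget Regularization term (iii); in a faithful implementation, sigma would additionally be lower-bounded by whatever the paper's DP analysis requires for the chosen clip norm and budget.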
Primary Area: alignment, fairness, safety, privacy, and societal considerations
Submission Number: 10134