DP-MicroAdam: Private and Frugal Algorithm for Training and Fine-tuning

Published: 22 Sept 2025, Last Modified: 22 Sept 2025 · WiML @ NeurIPS 2025 · CC BY 4.0
Keywords: Differential Privacy, Private Fine-Tuning, Adam
Abstract: We introduce DP-MicroAdam, a differentially private variant of Adam for effective private training and fine-tuning. DP-MicroAdam combines sparsity-aware updates with adaptive scaling to mitigate the effects of gradient clipping and noise injection. Empirically, it is robust to hyperparameters such as the clipping threshold and removes the need for de-biasing techniques. On CIFAR-10 with Wide-ResNets, DP-MicroAdam achieves 84.8% test accuracy under $(8, 10^{-5})$-DP, surpassing the previous state of the art. Ongoing work extends the evaluation to larger datasets and to private fine-tuning scenarios, where sparsity and adaptivity are expected to provide the greatest benefits.
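The abstract does not spell out the update rule, so the following is a minimal sketch of one plausible DP-MicroAdam-style step, assuming DP-SGD-style per-example clipping and Gaussian noise combined with a top-k sparsity mask and Adam-style adaptive scaling without bias correction. The function name `dp_microadam_step`, the hyperparameter names, and the exact sparsification rule are illustrative assumptions, not the authors' specification.

```python
import numpy as np

def dp_microadam_step(params, per_example_grads, state, *,
                      lr=1e-3, clip_norm=1.0, noise_multiplier=1.0,
                      beta1=0.9, beta2=0.999, eps=1e-8, top_k_frac=0.1,
                      rng=None):
    """One illustrative step: per-example clipping, Gaussian noise, and a
    sparsity-masked Adam-style adaptive update without bias correction.
    This is a sketch under assumed design choices, not the published algorithm."""
    rng = np.random.default_rng() if rng is None else rng
    n = per_example_grads.shape[0]

    # Per-example clipping to L2 norm <= clip_norm (standard DP-SGD ingredient).
    norms = np.linalg.norm(per_example_grads.reshape(n, -1), axis=1)
    scale = np.minimum(1.0, clip_norm / (norms + 1e-12))
    clipped = per_example_grads * scale.reshape(n, *([1] * (per_example_grads.ndim - 1)))

    # Sum the clipped gradients, add calibrated Gaussian noise, then average.
    noisy_grad = (clipped.sum(axis=0)
                  + noise_multiplier * clip_norm * rng.standard_normal(params.shape)) / n

    # Sparsity-aware update: keep only the largest-magnitude coordinates (assumed rule).
    k = max(1, int(top_k_frac * noisy_grad.size))
    thresh = np.partition(np.abs(noisy_grad).ravel(), -k)[-k]
    sparse_grad = noisy_grad * (np.abs(noisy_grad) >= thresh)

    # Adam-style first and second moments with adaptive scaling, no bias correction.
    state["m"] = beta1 * state["m"] + (1 - beta1) * sparse_grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * sparse_grad ** 2
    new_params = params - lr * state["m"] / (np.sqrt(state["v"]) + eps)
    return new_params, state


# Example usage on dummy per-example gradients (hypothetical shapes).
params = np.zeros(10)
state = {"m": np.zeros_like(params), "v": np.zeros_like(params)}
grads = np.random.default_rng(0).standard_normal((32, 10))  # 32 per-example gradients
params, state = dp_microadam_step(params, grads, state)
```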
Submission Number: 117