Keywords: Differential privacy, non-convex optimization, saddle points
Abstract: This paper addresses the challenge of achieving second-order stationary points (SOSP) in differentially private stochastic non-convex optimization. We identify two key limitations in the state of the art: (i) inaccurate error rates caused by omitting gradient variance from the saddle point escape analysis, which leads to inappropriate parameter choices and overly optimistic performance estimates, and (ii) inefficiencies in private SOSP selection via the AboveThreshold algorithm, particularly in distributed learning settings, where perturbing and sharing Hessian matrices introduces significant additional noise. To overcome these challenges, we revisit perturbed stochastic gradient descent (SGD) with Gaussian noise and propose a new framework that leverages general gradient oracles. This framework introduces a novel criterion based on model drift distance, ensuring provable saddle point escape and efficient convergence to approximate local minima with low iteration complexity. Using an adaptive SPIDER as the gradient oracle, we establish a new DP algorithm that corrects existing error rates. Furthermore, we extend our approach to a distributed adaptive SPIDER, applying our framework to distributed learning and providing the first theoretical results on achieving SOSP under differential privacy in distributed environments with heterogeneous data. Finally, we analyze the limitations of the AboveThreshold algorithm for private model selection in distributed learning and show that as the model dimension grows, the selection process introduces additional error, further demonstrating the advantage of our proposed framework.
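To make the escape mechanism described in the abstract concrete, the following is a minimal, self-contained sketch of perturbed DP-SGD with a SPIDER-style gradient oracle and a drift-distance escape check. It is not the authors' algorithm or code: the toy objective, step size, noise scales, window, and threshold constants are illustrative assumptions, and the Gaussian noise is not calibrated to any specific (epsilon, delta) budget.

```python
import numpy as np

rng = np.random.default_rng(0)

def stoch_grad(x, noise):
    # Toy non-convex objective f(x) = x[0]**4/4 - x[0]**2/2 + x[1]**2/2:
    # the origin is a strict saddle point; (+-1, 0) are local minima.
    return np.array([x[0]**3 - x[0], x[1]]) + noise

def dp_spider_sgd(x, T=2000, q=20, eta=0.05, sigma_dp=0.01,
                  drift_radius=0.5, escape_window=200):
    """Sketch: perturbed DP-SGD with a SPIDER-style oracle and a drift-distance criterion."""
    v, x_prev = None, x.copy()
    anchor, since_anchor = x.copy(), 0
    for t in range(T):
        noise = rng.normal(scale=0.02, size=2)         # models stochastic-gradient variance
        if v is None or t % q == 0:
            v = stoch_grad(x, noise)                    # SPIDER: periodic (large-batch) refresh
        else:
            # SPIDER: recursive correction using the same sample at x and x_prev
            v = v + stoch_grad(x, noise) - stoch_grad(x_prev, noise)
        x_prev = x.copy()
        # Gaussian perturbation standing in for DP noise (privacy calibration omitted)
        x = x - eta * (v + rng.normal(scale=sigma_dp, size=2))
        since_anchor += 1
        if np.linalg.norm(x - anchor) > drift_radius:
            # Large drift within the window: the iterate has escaped; restart the check here
            anchor, since_anchor = x.copy(), 0
        elif since_anchor >= escape_window:
            # Little drift over a whole window: report an approximate-SOSP candidate
            return x
    return x

print(dp_spider_sgd(np.array([1e-3, 1e-3])))  # drifts away from the saddle toward (+-1, 0)
```

In this sketch the drift-distance rule replaces any Hessian-based check: escape is certified by how far the iterate moves within a window, which is the property the abstract's framework exploits to avoid perturbing and sharing Hessian matrices.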
Primary Area: alignment, fairness, safety, privacy, and societal considerations
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6580