Improved Sample Complexity for Private Nonsmooth Nonconvex Optimization

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: Improved algorithms for differentially private nonsmooth nonconvex optimization
Abstract: We study differentially private (DP) optimization algorithms for stochastic and empirical objectives which are neither smooth nor convex, and propose methods that return a Goldstein-stationary point with sample complexity bounds improving on existing work. We start by providing a single-pass $(\epsilon,\delta)$-DP algorithm that returns an $(\alpha,\beta)$-stationary point as long as the dataset is of size $\widetilde{\Omega}(\sqrt{d}/\alpha\beta^{3}+d/\epsilon\alpha\beta^{2})$, which is $\Omega(\sqrt{d})$ times smaller than the sample size required by the algorithm of Zhang et al. (2024) for this task, where $d$ is the dimension. We then provide a multi-pass polynomial-time algorithm which further improves the sample complexity to $\widetilde{\Omega}\left(d/\beta^2+d^{3/4}/\epsilon\alpha^{1/2}\beta^{3/2}\right)$, by designing a sample-efficient ERM algorithm and proving that Goldstein-stationary points generalize from the empirical loss to the population loss.
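For readers unfamiliar with the stationarity notion referenced above, the following is a brief sketch of the standard definition of $(\alpha,\beta)$-Goldstein stationarity assumed here; the paper's precise formulation may differ slightly in notation.

```latex
% Sketch of the standard (alpha, beta)-Goldstein stationarity notion
% (assumed convention; notation may differ from the paper itself).
% The beta-Goldstein subdifferential of f at x collects convex combinations
% of (sub)gradients taken at points within distance beta of x:
\[
  \partial_{\beta} f(x) \;=\; \mathrm{conv}\Bigl(\bigcup_{y \colon \|y-x\|\le\beta} \partial f(y)\Bigr).
\]
% A point x is (alpha, beta)-stationary if this set contains a small element:
\[
  \min_{g \in \partial_{\beta} f(x)} \|g\| \;\le\; \alpha .
\]
```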
Lay Summary: We design algorithms that solve data-dependent optimization problems which lack smoothness and convexity, so that even after seeing the solution to the problem, the data itself remains private. Such optimization problems arise regularly when training neural networks in settings where the privacy of the training data must be maintained. Privacy is measured via a well-studied notion called differential privacy, and the returned solution is an approximate stationary point of the loss function. A private algorithm for this problem was previously suggested by Zhang et al. (2024). The private algorithms we propose and analyze in this work find such solutions using less data; equivalently, given the same amount of data, they find solutions with higher accuracy than previous algorithms.
Primary Area: Social Aspects->Privacy
Keywords: Differential privacy, nonconvex optimization, nonsmooth optimization, Goldstein stationarity
Submission Number: 7563