Differentially Private Algorithms for the Stochastic Compositional Optimization Problem

Published: 03 Feb 2026, Last Modified: 06 Feb 2026. AISTATS 2026 Poster. License: CC BY 4.0
Abstract: In this paper, we study the stochastic compositional optimization problem under the constraint of differential privacy. We first introduce two private algorithms: noisy stochastic compositional gradient descent (NSCGD) and the noisy stochastically corrected stochastic compositional gradient (NSCSC) method. Using the algorithmic stability approach, we establish bounds on the excess population loss of both methods in the strongly convex and convex settings. However, these methods require a number of gradient computations that is super-linear in the number of training samples. To address this, we propose a class of output perturbation-based randomized algorithms by exploiting the stability of the compositional empirical risk minimizer under the privacy constraint. These algorithms achieve comparable excess population risk with significantly fewer gradient computations.
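The abstract does not spell out the update rules, but a noisy stochastic compositional gradient step can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the quadratic objective, the running-average tracking of the inner expectation, and all parameter names (`eta`, `beta`, `sigma`, `clip`) are assumptions chosen for concreteness; in a real DP analysis `sigma` would be calibrated to the gradient sensitivity and the target privacy budget.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative compositional problem (an assumption, not from the paper):
# minimize f(E_w[g_w(x)]) with inner maps g_w(x) = A_w x + b_w
# and outer function f(y) = 0.5 * ||y||^2, so grad f(y) = y.
d, m, n = 5, 3, 10               # parameter dim, inner-output dim, sample count
A = rng.normal(size=(n, m, d))   # n samples of the linear inner map
b = rng.normal(size=(n, m))

def noisy_compositional_sgd(steps=200, eta=0.05, beta=0.9,
                            sigma=0.1, clip=1.0):
    """Sketch: track a running estimate y of the inner expectation
    E_w[g_w(x)], form the chain-rule gradient grad(g_w)^T grad(f)(y),
    clip it to bound sensitivity, and add Gaussian noise for privacy."""
    x = np.zeros(d)
    y = np.zeros(m)
    for _ in range(steps):
        i = rng.integers(n)                 # sample an inner index w
        g = A[i] @ x + b[i]                 # evaluate g_w(x)
        y = (1 - beta) * y + beta * g       # running inner estimate
        grad = A[i].T @ y                   # chain rule with grad f(y) = y
        norm = np.linalg.norm(grad)
        if norm > clip:
            grad *= clip / norm             # clip to bound sensitivity
        noise = sigma * rng.normal(size=d)  # Gaussian-mechanism noise
        x = x - eta * (grad + noise)
    return x

x_hat = noisy_compositional_sgd()
print(x_hat.shape)
```

The tracking variable `y` is the standard device in compositional SGD for avoiding a fresh inner-expectation estimate at every step; here it is shown only to make the two-level structure of the gradient visible.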
Submission Number: 576