Improved Differentially Private Riemannian Optimization: Fast Sampling and Variance Reduction

Published: 21 Feb 2023, Last Modified: 28 Feb 2023
Accepted by TMLR
Abstract: A common step in differentially private (DP) Riemannian optimization is sampling from the (tangent) Gaussian distribution, as noise needs to be generated in the tangent space to perturb the gradient. Existing works rely either on Markov chain Monte Carlo (MCMC) sampling or on sampling methods based on explicit basis construction for the tangent space. This becomes a computational bottleneck in the practical use of DP Riemannian optimization, especially when performing stochastic optimization. In this paper, we discuss different sampling strategies and develop efficient sampling procedures by exploiting the linear isometry between tangent spaces, showing them to be orders of magnitude faster than both MCMC sampling and sampling based on explicit basis construction. Furthermore, we develop a DP Riemannian stochastic variance reduced gradient (DP-RSVRG) algorithm and compare it with DP Riemannian gradient descent and stochastic gradient descent algorithms on various problems.
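For intuition, below is a minimal sketch of what isometry-based tangent Gaussian sampling can look like on the unit sphere (not the paper's exact algorithm; the function name and setup are illustrative). Projecting an isotropic ambient Gaussian orthogonally onto the tangent space at a point has the same law as sampling in a fixed reference tangent space and transporting the sample by a linear isometry, so no MCMC chain or explicit basis construction is needed.

```python
import numpy as np

def sample_tangent_gaussian_sphere(x, sigma, rng=None):
    """Draw v ~ N(0, sigma^2 I) in the tangent space T_x of the unit
    sphere S^{d-1} at the point x.

    The orthogonal projection of an isotropic ambient Gaussian onto T_x
    is an isotropic Gaussian on T_x, which matches (in distribution)
    sampling at a reference tangent space and mapping to T_x via a
    linear isometry (a rotation).
    """
    rng = np.random.default_rng() if rng is None else rng
    z = sigma * rng.standard_normal(x.shape[0])  # ambient Gaussian in R^d
    return z - (x @ z) * x                       # orthogonal projection onto T_x

# Usage: generate tangent-space noise to perturb a Riemannian gradient
x = np.array([1.0, 0.0, 0.0])                    # a point on S^2
noise = sample_tangent_gaussian_sphere(x, sigma=0.1)
assert abs(x @ noise) < 1e-12                    # noise lies in T_x
```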
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: We have done a **major revision** of the paper to address all the concerns of the reviewers and to improve the overall readability of the paper. In particular, we have done the following.
* We have revised Sections 3 and 4 thoroughly. We now explicitly mention the proposed sampling algorithms for the manifolds of interest.
* We have added additional experiments in Section 6 as requested and, in addition, a new baseline `Explicit-Sparse`.
* We have added a discussion subsection in Section 5 comparing DP-RSVRG with DP-RSGD and DP-RGD.
* We have proofread the paper for typos and omissions.
Assigned Action Editor: ~Ivan_Oseledets1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 475