An automatic differentiation system for the age of differential privacy

Sep 17, 2021 (edited Oct 20, 2021), PriML 2021 Poster
  • TL;DR: A method that uses sensitivity tracking for optimal noise calibration in differentially private machine learning
  • Abstract: We introduce Tritium, an automatic differentiation-based sensitivity analysis framework for differentially private (DP) machine learning (ML). Optimal noise calibration in this setting requires efficient Jacobian matrix computations and tight bounds on the L2-sensitivity. Our framework achieves these objectives by relying on a functional analysis-based method for sensitivity tracking, which we briefly outline. This approach interoperates seamlessly with static graph-based automatic differentiation, enabling order-of-magnitude improvements in compilation times compared to previous work. Moreover, we demonstrate that optimising the sensitivity of the entire computational graph at once yields substantially tighter estimates of the true sensitivity than interval bound propagation techniques. Our work complements recent developments in DP such as individual privacy accounting, aims to offer improved privacy-utility trade-offs, and represents a step towards integrating accessible machine learning tooling with advanced privacy accounting systems.
  • Paper Under Submission: The paper is NOT under submission at NeurIPS
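The abstract's central point is that a tighter bound on the L2-sensitivity directly reduces the noise that must be added for a given privacy budget. A minimal sketch of this relationship, using the classical Gaussian-mechanism calibration (not the paper's own Tritium implementation; the function name and the example sensitivity values are illustrative):

```python
import math

def gaussian_sigma(l2_sensitivity: float, epsilon: float, delta: float) -> float:
    """Noise scale for the classical Gaussian mechanism.

    Uses the standard calibration sigma = S * sqrt(2 * ln(1.25/delta)) / epsilon,
    which guarantees (epsilon, delta)-DP for epsilon < 1. The added noise scales
    linearly with the L2-sensitivity bound S, so a tighter sensitivity estimate
    (e.g. from whole-graph optimisation rather than interval bound propagation)
    yields proportionally less noise at the same privacy level.
    """
    if not (0 < epsilon < 1) or not (0 < delta < 1):
        raise ValueError("requires 0 < epsilon < 1 and 0 < delta < 1")
    return l2_sensitivity * math.sqrt(2 * math.log(1.25 / delta)) / epsilon

# Illustrative comparison: a loose bound (e.g. from interval propagation)
# versus a tighter whole-graph bound on the same computation.
loose_sigma = gaussian_sigma(l2_sensitivity=4.0, epsilon=0.5, delta=1e-5)
tight_sigma = gaussian_sigma(l2_sensitivity=1.0, epsilon=0.5, delta=1e-5)
```

Because sigma is linear in the sensitivity, halving the sensitivity bound halves the noise standard deviation, which is why tight, graph-wide sensitivity analysis translates directly into better privacy-utility trade-offs.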