SGD on Random Mixtures: Private Machine Learning under Data Breach Threats

12 Feb 2018 (modified: 05 May 2023) · ICLR 2018 Workshop Submission
Abstract: We propose Stochastic Gradient Descent on Random Mixtures (SGDRM) as a simple way of protecting data under data breach threats. We show that SGDRM converges to the globally optimal point for deep neural networks with linear activations while being differentially private. We also train nonlinear neural networks using private mixtures as the training data, demonstrating the practicality of SGDRM.
Keywords: SGD on random mixtures, SGDRM, differential privacy
TL;DR: SGDRM is the SGD algorithm run on random mixtures; it is differentially private and has convergence guarantees.
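As a rough illustration of the idea in the TL;DR, below is a minimal sketch of SGD run on random mixtures of training examples instead of the raw data. The mixing scheme (Gaussian-weighted combinations of k examples), the toy linear-regression task, and all parameters are assumptions made for illustration; the paper's exact mixture construction and privacy calibration are not given in this abstract.

# Minimal sketch of SGD on random mixtures (SGDRM) for a linear model.
# ASSUMPTION: mixtures are random Gaussian-weighted combinations of k
# training examples; the paper's exact construction may differ.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: linear regression with Gaussian features.
n, d = 1000, 10
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

def random_mixture(X, y, k, rng):
    """Return one random mixture of k examples (assumed mixing scheme)."""
    idx = rng.integers(0, len(X), size=k)
    c = rng.normal(size=k) / np.sqrt(k)  # mixing coefficients (assumption)
    return c @ X[idx], c @ y[idx]        # mixed features and mixed label

# Plain SGD, but every gradient is computed on a fresh random mixture,
# so the raw examples never enter the update directly.
w = np.zeros(d)
lr = 0.01
for _ in range(5000):
    x_mix, y_mix = random_mixture(X, y, k=32, rng=rng)
    grad = 2.0 * (x_mix @ w - y_mix) * x_mix  # squared-loss gradient
    w -= lr * grad

print("distance to true weights:", np.linalg.norm(w - w_true))

Note that because the labels here are (nearly) linear in the features, any linear combination of examples satisfies the same linear relation as the raw data, which matches the intuition behind the abstract's convergence claim for networks with linear activations.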