Abstract: We are interested in privatizing an approximate posterior inference algorithm, called Expectation
Propagation (EP). EP approximates the posterior distribution by iteratively refining
approximations to the local likelihood terms. As a result, EP typically yields better posterior
uncertainties than variational inference (VI), which approximates the likelihood globally.
However, EP requires a large amount of memory, as it maintains one local approximation
per datapoint in the training data. To overcome this challenge, stochastic expectation
propagation (SEP) considers a single unique local factor that captures the average effect of
each likelihood term on the posterior and refines it in a manner analogous to EP. SEP is
more amenable to privatization than EP: at each refinement step, the remaining factors are
held fixed and, unlike in EP, do not depend on the other datapoints. This independence
makes the sensitivity analysis straightforward. We
provide a theoretical analysis of the privacy-accuracy trade-off in the posterior distributions
under our method, which we call differentially private stochastic expectation propagation
(DP-SEP). Furthermore, we test the DP-SEP algorithm on both synthetic and real-world
datasets and evaluate the quality of posterior estimates at different levels of guaranteed
privacy.
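As a rough illustration of the SEP update described above (a single shared site factor refined by moment matching against one datapoint at a time), here is a minimal sketch on a hypothetical toy model: a conjugate Gaussian with unknown mean and known noise variance, tracked in natural parameters. The model, variable names, and damping schedule are all assumptions made for illustration; this is not the paper's DP-SEP algorithm, and it omits privatization entirely.

```python
import random

# Toy SEP sketch (illustrative assumption, not the paper's DP-SEP):
# unknown mean theta with known noise variance sigma2 and Gaussian prior
# N(mu0, 1/lam0). We work in natural parameters (eta = precision * mean,
# lam = precision), where multiplying Gaussian factors means adding params.
# SEP keeps ONE shared site factor f that captures the average effect of
# a likelihood term, so q = prior * f^N.

random.seed(0)
sigma2 = 1.0
mu0, lam0 = 0.0, 1.0
N = 200
data = [random.gauss(2.0, sigma2 ** 0.5) for _ in range(N)]

# shared site factor, initialised to the uniform factor (zero natural params)
f_eta, f_lam = 0.0, 0.0

for epoch in range(50):
    for x in data:
        # global approximation: q = prior * f^N
        q_eta = lam0 * mu0 + N * f_eta
        q_lam = lam0 + N * f_lam
        # cavity: remove one copy of the shared factor from q
        cav_eta, cav_lam = q_eta - f_eta, q_lam - f_lam
        # tilted distribution cavity * N(x | theta, sigma2) is Gaussian here,
        # so moment matching is exact: just add the likelihood's natural params
        tilt_eta = cav_eta + x / sigma2
        tilt_lam = cav_lam + 1.0 / sigma2
        # implied new site factor = tilted / cavity
        site_eta = tilt_eta - cav_eta
        site_lam = tilt_lam - cav_lam
        # SEP's damped update: move the shared factor 1/N of the way
        f_eta = (1 - 1 / N) * f_eta + site_eta / N
        f_lam = (1 - 1 / N) * f_lam + site_lam / N

# final approximate posterior vs. the exact conjugate posterior
q_lam = lam0 + N * f_lam
q_mean = (lam0 * mu0 + N * f_eta) / q_lam
exact_lam = lam0 + N / sigma2
exact_mean = (lam0 * mu0 + sum(data) / sigma2) / exact_lam
print(q_mean, exact_mean)
```

Because a single shared factor is refined, only one set of natural parameters is stored regardless of N, which is the memory saving over EP that the abstract refers to; in the conjugate case above the SEP posterior closely tracks the exact one.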
Submission Length: Regular submission (no more than 12 pages of main content)
Previous TMLR Submission Url: https://openreview.net/forum?id=Gjt07VwAqP&noteId=wE9fIwT9XB
Changes Since Last Submission: Code repository anonymized according to the previous submission's recommendation
Assigned Action Editor: ~Yu-Xiang_Wang1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 165