Keywords: federated learning, differential privacy, private quantiles, distributionally robust learning
TL;DR: Federated quantile computation with DP: the asymptotically suboptimal algorithm performs better at the problem sizes of practical interest
Abstract: The computation of analytics in a federated environment plays an increasingly important role in data science and machine learning. We consider the differentially private computation of the quantiles of a distribution of values stored on a population of clients. We present two quantile estimation algorithms, based on the distributed discrete Gaussian mechanism, that are compatible with secure aggregation. Based on a privacy-utility analysis and numerical experiments, we delineate the regime in which each algorithm is superior. We find that the algorithm with suboptimal asymptotic performance performs best at the moderate problem sizes typical of federated learning with client sampling. We apply these algorithms to augment distributionally robust federated learning with differential privacy.
Is Student: No