Decision Tree for Locally Private Estimation with Public Data

Published: 21 Sept 2023, Last Modified: 01 Jan 2024. NeurIPS 2023 poster.
Keywords: Local differential privacy, non-parametric regression, decision tree, public data
TL;DR: Use a small amount of public data to enhance the performance of locally private regression.
Abstract: We propose conducting locally differentially private (LDP) estimation with the aid of a small amount of public data to enhance the performance of private estimation. Specifically, we introduce an efficient algorithm called Locally differentially Private Decision Tree (LPDT) for LDP regression. We first use the public data to grow a decision tree partition and then privately fit an estimator according to the partition. From a theoretical perspective, we show that LPDT is $\varepsilon$-LDP and attains a minimax optimal convergence rate under a mild similarity assumption between the public and private data, whereas, without public data, the minimax lower bound on the convergence rate is strictly slower. This implies that public data helps improve the convergence rates of LDP estimation. We conduct experiments on both synthetic and real-world data to demonstrate the superior performance of LPDT compared with other state-of-the-art LDP regression methods. Moreover, we show that LPDT remains effective despite considerable disparities between the public and private data.
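The two-step recipe in the abstract (grow a partition on public data, then fit per-cell estimates under LDP) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the median-split rule, the function names (`grow_partition`, `ldp_tree_estimate`), and the Laplace response mechanism with clipping bound `B` are all assumptions chosen to show the structure of such an algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def grow_partition(X_pub, depth=3):
    """Grow an axis-aligned tree partition of [0, 1]^p on PUBLIC data by
    recursively splitting at coordinate medians; return leaf boxes (lo, hi)."""
    def split(idx, lo, hi, d):
        if d == 0 or len(idx) < 2:
            return [(lo.copy(), hi.copy())]
        j = d % X_pub.shape[1]                 # cycle through features
        thr = np.median(X_pub[idx, j])
        left, right = idx[X_pub[idx, j] <= thr], idx[X_pub[idx, j] > thr]
        if len(left) == 0 or len(right) == 0:  # degenerate split: stop here
            return [(lo.copy(), hi.copy())]
        hi_l, lo_r = hi.copy(), lo.copy()
        hi_l[j], lo_r[j] = thr, thr
        return split(left, lo, hi_l, d - 1) + split(right, lo_r, hi, d - 1)
    p = X_pub.shape[1]
    return split(np.arange(len(X_pub)), np.zeros(p), np.ones(p), depth)

def leaf_of(x, leaves):
    """Index of the first leaf box containing x."""
    for k, (lo, hi) in enumerate(leaves):
        if np.all(x >= lo) and np.all(x <= hi):
            return k
    return len(leaves) - 1  # fallback for boundary effects

def ldp_tree_estimate(X_priv, y_priv, leaves, eps, B=1.0):
    """Each PRIVATE user releases one clipped, Laplace-perturbed response
    (sensitivity 2B -> scale 2B/eps, giving eps-LDP for the response);
    the server averages the noisy responses within each leaf."""
    noisy = np.clip(y_priv, -B, B) + rng.laplace(0.0, 2 * B / eps, len(y_priv))
    sums, counts = np.zeros(len(leaves)), np.zeros(len(leaves))
    for x, z in zip(X_priv, noisy):
        k = leaf_of(x, leaves)
        sums[k] += z
        counts[k] += 1
    return np.where(counts > 0, sums / np.maximum(counts, 1), 0.0)
```

The key point mirrored here is that the tree structure itself consumes no privacy budget because it is learned from public data; only the per-leaf response averages are privatized.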
Supplementary Material: pdf
Submission Number: 11907