Approximate Kernel Density Estimation under Metric-based Local Differential Privacy

Published: 26 Apr 2024 · Last Modified: 13 Jun 2024 · UAI 2024 poster · License: CC BY 4.0
Keywords: kernel density estimation, local differential privacy, locality-sensitive hashing
Abstract: Kernel Density Estimation (KDE) is a fundamental problem with broad machine learning applications. In this paper, we investigate the KDE problem under Local Differential Privacy (LDP), a setting in which users privatize data on their own devices before sending them to an untrusted server for analytics. To strike a balance between ensuring local privacy and preserving high-utility KDE results, we adopt a relaxed definition of LDP based on metrics (mLDP), which is suitable when data points are represented in a metric space and can be more distinguishable as their distances increase. To the best of our knowledge, approximate KDE under mLDP has not been explored in the existing literature. We propose the mLDP-KDE framework, which augments a locality-sensitive hashing-based sketch method to provide mLDP and answer any KDE query unbiasedly within an additive error with high probability in sublinear time and space. Extensive experimental results demonstrate that the mLDP-KDE framework outperforms several existing KDE methods under LDP and mLDP by achieving significantly better trade-offs between privacy and utility, with particularly remarkable advantages on large, high-dimensional data.
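The abstract's core idea — that a locality-sensitive hashing (LSH) sketch can answer KDE queries unbiasedly for kernels equal to an LSH collision probability — can be illustrated with a minimal sketch. This is a generic LSH-count sketch for the angular kernel using signed random projections, not the paper's mLDP-KDE framework (in particular, it includes no privatization step); all names and parameters below are illustrative assumptions.

```python
import numpy as np

def srp_hash(x, planes):
    # Signed random projections: one bit per hyperplane, packed into a bucket id.
    bits = (planes @ x >= 0).astype(int)
    return int("".join(map(str, bits)), 2)

class LshKdeSketch:
    """Illustrative LSH-based KDE sketch (not the paper's method).

    Each of `rows` independent repetitions hashes points into 2**bits
    buckets and counts bucket occupancy. Because Pr[h(q) == h(x)] equals
    the angular kernel k(q, x) = (1 - theta(q, x)/pi) ** bits, the mean
    count at the query's bucket is an unbiased estimate of sum_i k(q, x_i).
    """
    def __init__(self, dim, rows=50, bits=3, seed=0):
        rng = np.random.default_rng(seed)
        self.planes = rng.standard_normal((rows, bits, dim))
        self.counts = np.zeros((rows, 2 ** bits))
        self.rows = rows

    def add(self, x):
        # Increment the counter of x's bucket in every repetition.
        for r in range(self.rows):
            self.counts[r, srp_hash(x, self.planes[r])] += 1

    def kde(self, q, n):
        # Average over repetitions, normalized by dataset size n.
        est = np.mean([self.counts[r, srp_hash(q, self.planes[r])]
                       for r in range(self.rows)])
        return est / n
```

Space is sublinear in the dataset size (rows × 2**bits counters), and a query touches only one counter per repetition; if every stored point coincides with the query, the estimate is exactly 1.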
List Of Authors: Zhou, Yi and Wang, Yanhao and Teng, Long and Huang, Qiang and Chen, Cen
Code Url: https://github.com/yz2022/mldp-kde
Submission Number: 352