Certifying Private Probabilistic Mechanisms

Published: 01 Jan 2024 · Last Modified: 16 May 2025 · CRYPTO (6) 2024 · CC BY-SA 4.0
Abstract: In recent years, entire research communities have arisen to address concerns of privacy and fairness in data analysis. At present, however, the public must trust that institutions will voluntarily re-implement algorithms to account for these social concerns. Because of the additional cost, widespread adoption is unlikely without effective legal enforcement. A technical challenge for enforcement is that the proposed methods are often probabilistic mechanisms, whose output must be drawn according to precise, and sometimes secret, distributions. The Differential Privacy (DP) case is illustrative: if a cheating curator answers queries according to an overly-accurate mechanism, privacy violations could go undetected. This raises our central question: Can we efficiently certify the output of a probabilistic mechanism enacted by an untrusted party? To this end:
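To make the cheating-curator failure mode concrete, the following is a minimal Python sketch, not taken from the paper: the function names, the query, and the factor by which the cheater shrinks its noise are all illustrative assumptions. An honest curator running the ε-DP Laplace mechanism is contrasted with a cheater whose noise scale corresponds to a much larger ε. Any single answer from either looks plausible, which is why certifying the sampling process itself, rather than inspecting individual outputs, is the problem at hand.

```python
import numpy as np

# Illustrative sketch (not the paper's construction). All names and
# parameters below are assumptions chosen for the example.

EPSILON = 0.5       # privacy budget the curator claims to satisfy
SENSITIVITY = 1.0   # L1 sensitivity of a counting query

def honest_release(true_count: float, rng: np.random.Generator) -> float:
    """Honest epsilon-DP Laplace mechanism: noise scale = sensitivity / epsilon."""
    return true_count + rng.laplace(scale=SENSITIVITY / EPSILON)

def cheating_release(true_count: float, rng: np.random.Generator) -> float:
    """Overly-accurate mechanism: same interface, but the noise scale
    corresponds to a 10x larger epsilon, so the claimed DP guarantee fails."""
    return true_count + rng.laplace(scale=SENSITIVITY / (10 * EPSILON))

rng = np.random.default_rng(0)
true_count = 42.0
honest = np.array([honest_release(true_count, rng) for _ in range(10_000)])
cheat = np.array([cheating_release(true_count, rng) for _ in range(10_000)])

# The cheater's answers are systematically closer to the truth, but no
# single output betrays it; only the distribution does.
print(f"honest   mean abs error: {np.mean(np.abs(honest - true_count)):.2f}")
print(f"cheating mean abs error: {np.mean(np.abs(cheat - true_count)):.2f}")
```

Over many queries the cheater's tighter error distribution leaks more about individuals than the claimed ε permits, yet an auditor who only sees outputs cannot flag any one of them, motivating efficient certification of the mechanism itself.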