Abstract: Differential privacy has emerged as a cornerstone of scientific hypothesis testing on confidential data. When data are not confidential, Bayesian tests are widely used in reporting scientific discoveries, as they effectively address the key criticisms of p-values, namely their lack of interpretability and inability to quantify evidence in support of competing hypotheses. In this article, we introduce a novel framework for differentially private Bayesian hypothesis testing, thereby expanding the applicability of Bayesian testing to confidential data. This framework arises naturally from a principled data-generative mechanism, ensuring that the resulting inferences retain interpretability while maintaining privacy. Further, by focusing on differentially private Bayes factors based on test statistics, we circumvent the need to model the complete data-generative mechanism and secure substantial computational benefits. We also provide a set of sufficient conditions for establishing Bayes factor consistency under the proposed framework. Finally, the utility of the proposed methodology is showcased via several numerical experiments.
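To illustrate the flavor of the approach the abstract describes, here is a minimal, hypothetical sketch of a differentially private Bayes factor built from a test statistic. It is not the authors' method: it simply privatizes a z-statistic with the Laplace mechanism and then computes the Bayes factor from the marginal densities of the *noisy* statistic under each hypothesis, obtained by numerically convolving the statistic's sampling distribution with the Laplace noise density. The sensitivity, privacy budget, and prior scale below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)


def privatize(t, sensitivity, epsilon, rng):
    """Laplace mechanism: add noise with scale = sensitivity / epsilon."""
    return t + rng.laplace(0.0, sensitivity / epsilon)


def laplace_pdf(x, scale):
    return np.exp(-np.abs(x) / scale) / (2.0 * scale)


def normal_pdf(x, sd):
    return np.exp(-0.5 * (x / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))


def marginal_of_noisy_stat(s, stat_sd, noise_scale, grid):
    """Density of S = T + Laplace noise, where T ~ N(0, stat_sd^2).

    The convolution integral is approximated by a Riemann sum on `grid`.
    """
    vals = normal_pdf(grid, stat_sd) * laplace_pdf(s - grid, noise_scale)
    return vals.sum() * (grid[1] - grid[0])


# Hypothetical setup:
#   H0: z-statistic T ~ N(0, 1)
#   H1: T ~ N(mu, 1) with prior mu ~ N(0, tau^2),
#       so marginally T ~ N(0, 1 + tau^2).
tau = 2.0                     # assumed prior scale under H1
sensitivity, epsilon = 1.0, 1.0  # assumed sensitivity and privacy budget
noise_scale = sensitivity / epsilon
grid = np.linspace(-25.0, 25.0, 5001)

t = 2.5                       # hypothetical observed z-statistic
s = privatize(t, sensitivity, epsilon, rng)

m0 = marginal_of_noisy_stat(s, 1.0, noise_scale, grid)
m1 = marginal_of_noisy_stat(s, np.sqrt(1.0 + tau**2), noise_scale, grid)
bf10 = m1 / m0  # Bayes factor for H1 vs H0 based only on the private statistic
```

Because the Bayes factor is a function of the released noisy statistic alone, it inherits the differential privacy guarantee of the Laplace mechanism by post-processing; no further access to the confidential data is needed.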