Privacy-Utility Trade-offs in Neural Networks for Medical Population Graphs: Insights from Differential Privacy and Graph Structure

Published: 28 Oct 2023, Last Modified: 21 Dec 2023 · NeurIPS 2023 GLFrontiers Workshop Poster
Keywords: Differential Privacy, Graph Neural Networks, Medical Population Graphs
TL;DR: We investigate differentially private graph neural networks on population graphs in the medical domain, exploring the impact of the graph structure on performance.
Abstract: Differential privacy (DP) is the gold standard for protecting individuals' data while enabling deep learning. It is well-established and frequently used in medicine and healthcare to protect sensitive patient data. When using graph deep learning on so-called population graphs, however, applying DP becomes more challenging than on grid-like data structures such as images or tables. In this work, we initiate an empirical investigation of differentially private graph neural networks on population graphs in the medical domain by examining privacy-utility trade-offs under different graph learning methods on both real-world and synthetic datasets. We compare two state-of-the-art methods for differentially private graph deep learning and empirically audit their privacy guarantees through node membership inference and link stealing attacks. We focus on the impact of the graph structure, one of the most important inherent challenges of medical population graphs. Our findings highlight the potential and the challenges of this specific DP application area. Moreover, we identify the underlying graph structure as a potential factor behind the larger performance gaps of one of the explored methods, showing a correlation between a graph's homophily and the accuracy of the trained model.
Submission Number: 28
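The homophily measure referenced in the abstract can be illustrated with a minimal sketch. The snippet below computes the common edge-homophily ratio (the fraction of edges whose endpoints share a label); the function name, toy graph, and labels are illustrative assumptions, not artifacts from the paper itself.

```python
def edge_homophily(edges, labels):
    """Return the fraction of edges (u, v) whose endpoints share a label.

    edges  -- iterable of (u, v) node-index pairs
    labels -- mapping from node index to class label
    """
    edges = list(edges)
    if not edges:
        return 0.0
    same_label = sum(1 for u, v in edges if labels[u] == labels[v])
    return same_label / len(edges)


# Toy population graph: 4 nodes with binary labels.
labels = {0: 0, 1: 0, 2: 1, 3: 1}
edges = [(0, 1), (1, 2), (2, 3)]  # two homophilous edges, one heterophilous
print(edge_homophily(edges, labels))  # -> 0.666...
```

A graph with a ratio near 1 is highly homophilous (neighbors tend to share labels), which is the property the paper correlates with model accuracy.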