GraphPrivatizer: Improved Structural Differential Privacy for Graph Neural Networks

Published: 01 Oct 2024, Last Modified: 01 Oct 2024. Accepted by TMLR. License: CC BY 4.0.
Abstract: Graph privacy is crucial in graph-structured systems where the confidentiality of participants is essential to the integrity of the system itself. For instance, banking systems and transaction networks must protect the privacy of customers' financial information and transaction details. We propose a method called GraphPrivatizer that privatizes the structure of a graph and protects it under Differential Privacy. GraphPrivatizer performs a controlled perturbation of the graph structure by randomly replacing the neighbors of a node with other similar neighbors, according to some similarity metric. With regard to neighbor perturbation, we find that aggregating features to compute similarities and imposing a minimum similarity score between the original and the replaced nodes provides the best privacy-utility trade-off. We use our method to train a Graph Neural Network server-side without disclosing users' private information to the server. We conduct experiments on real-world graph datasets and empirically evaluate the privacy of our models against privacy attacks.
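The abstract describes a structure-perturbation scheme: each neighbor of a node may be randomly replaced by a similar node, where similarity is computed on aggregated features and a minimum similarity threshold is enforced. The sketch below illustrates this idea under stated assumptions; it is not the authors' implementation, and the function names, the keep-probability `p_keep`, the use of one-hop mean aggregation, and cosine similarity are all illustrative choices.

```python
import numpy as np


def aggregate_features(adj, feats):
    # One-hop mean aggregation: a node's representation is the average
    # of its own features and its neighbors' features (an assumption;
    # the paper only states that features are aggregated).
    deg = adj.sum(axis=1, keepdims=True) + 1.0
    return (feats + adj @ feats) / deg


def cosine_sim(a, b):
    # Cosine similarity between two feature vectors (illustrative metric).
    na, nb = np.linalg.norm(a), np.linalg.norm(b)
    if na == 0 or nb == 0:
        return 0.0
    return float(a @ b / (na * nb))


def privatize_neighbors(adj, feats, p_keep=0.7, min_sim=0.5, rng=None):
    # Randomized-response-style perturbation: each edge endpoint is kept
    # with probability p_keep; otherwise it is replaced by a candidate
    # node whose aggregated features are at least min_sim-similar to the
    # original neighbor, enforcing the minimum-similarity constraint.
    rng = np.random.default_rng(rng)
    n = adj.shape[0]
    agg = aggregate_features(adj, feats)
    new_adj = np.zeros_like(adj)
    for u in range(n):
        for v in np.flatnonzero(adj[u]):
            if rng.random() < p_keep:
                new_adj[u, v] = 1
                continue
            # Candidate replacements: similar-enough nodes other than u, v.
            cands = [w for w in range(n)
                     if w not in (u, v)
                     and cosine_sim(agg[v], agg[w]) >= min_sim]
            w = rng.choice(cands) if cands else v  # fall back to original
            new_adj[u, w] = 1
    return new_adj
```

With `p_keep=1.0` the graph is returned unchanged; lowering `p_keep` trades utility for privacy by rewiring more edges, while raising `min_sim` restricts replacements to structurally similar nodes. The formal Differential Privacy calibration of these probabilities is detailed in the paper itself.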
Submission Length: Long submission (more than 12 pages of main content)
Changes Since Last Submission: In the camera ready version we added: the OpenReview URL, a link to the code, the acceptance date, and the acknowledgments.
Code: https://github.com/pindri/gnn-structural-privacy
Assigned Action Editor: ~Giannis_Nikolentzos1
Submission Number: 2702