Keywords: Graph Neural Network, Differential Privacy
Abstract: Graph Neural Networks (GNNs) with Differential Privacy (DP) guarantees have been proposed to preserve privacy when nodes contain sensitive information that must be kept private yet is critical for training. Existing methods deploy a fixed, uniform noise-generation mechanism that cannot adapt across nodes, which increases the risk of graph information leakage and degrades the model's overall performance. To address these challenges, we propose NIP-GNN, a Node-level Individual Private GNN with a DP guarantee that applies adaptive perturbation to sensitive components to safeguard node information. First, we propose a Topology-based Node Influence Estimation (TNIE) method that infers unknown node influence with neighborhood and centrality awareness.
Second, given the resulting node influence ranking, we propose an adaptive private aggregation method that perturbs neighborhood embeddings according to node-wise influence.
Third, we privately train the graph learning algorithm on the perturbed aggregations, using adaptive residual connections across multiple convolution layers for node-level tasks. Theoretical analysis ensures that NIP-GNN satisfies the DP guarantee. Experiments on real-world graph datasets show that NIP-GNN offers stronger resistance to node inference attacks and achieves a better trade-off between privacy and accuracy.
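To make the adaptive private aggregation step concrete, below is a minimal NumPy sketch. The function name, the budget-allocation rule (per-node epsilon shares proportional to influence), and all parameter names are illustrative assumptions, not the paper's actual mechanism: it clips node embeddings to bound sensitivity, sum-aggregates them over the adjacency matrix, and adds per-node Gaussian noise calibrated to each node's epsilon share.

```python
import numpy as np

def adaptive_private_aggregation(features, adj, influence,
                                 total_eps=1.0, delta=1e-5,
                                 clip_norm=1.0, seed=0):
    """Sketch of influence-directed noisy neighborhood aggregation.

    features : (N, d) node embedding matrix
    adj      : (N, N) binary adjacency matrix
    influence: (N,) positive node influence scores (e.g., from a TNIE-style estimator)
    """
    rng = np.random.default_rng(seed)

    # Clip each node embedding so one node changes a neighborhood sum by at most clip_norm.
    norms = np.linalg.norm(features, axis=1, keepdims=True)
    clipped = features * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))

    # Sum-aggregate clipped neighbor embeddings (unnormalized GCN-style aggregation).
    agg = adj @ clipped

    # Hypothetical allocation: split the total privacy budget across nodes in
    # proportion to influence, so more influential nodes receive a larger
    # epsilon share and hence less noise.
    eps = total_eps * influence / influence.sum()

    # Standard Gaussian-mechanism noise scale for each per-node epsilon.
    sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / eps

    return agg + rng.normal(scale=sigma[:, None], size=agg.shape)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(5, 4))                    # toy node features
    A = (rng.random((5, 5)) < 0.4).astype(float)   # toy adjacency
    infl = rng.random(5) + 0.1                     # toy influence scores
    print(adaptive_private_aggregation(X, A, infl).shape)  # (5, 4)
```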
Primary Area: alignment, fairness, safety, privacy, and societal considerations
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 871