Efficient Federated Learning on Knowledge Graphs via Privacy-preserving Relation Embedding Aggregation

Published: 27 Mar 2022, Last Modified: 03 Jul 2024, FL4NLP@ACL2022
Keywords: Knowledge Graph Embedding, Federated Learning, Data Privacy
Abstract: Federated Learning (FL) on knowledge graphs (KGs) has yet to be studied as thoroughly as in other domains, such as computer vision and natural language processing. A recent study, FedE, first proposes an FL framework that shares entity embeddings of KGs across all clients. However, compared with model sharing in vanilla FL, the entity embedding sharing in FedE incurs severe privacy leakage: known entity embeddings can be used to infer whether a specific relation between two entities exists in a private client. In this paper, we first develop a novel attack that aims to recover the original data from embedding information, which we then use to evaluate the vulnerabilities of FedE. Furthermore, we propose a Federated learning paradigm with privacy-preserving Relation embedding aggregation (FedR) to address the privacy issue in FedE. Compared to entity embedding sharing, the relation embedding sharing policy also significantly reduces communication cost, because the set of relations, and hence the size of the exchanged queries, is much smaller. We conduct extensive experiments evaluating FedR with five embedding learning models and three benchmark KG datasets. Compared to FedE, FedR achieves similar utility and significant (nearly 2x) improvements in both privacy and efficiency on the link prediction task.
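To make the aggregation policy concrete, below is a minimal, hypothetical sketch of FedAvg-style averaging of relation embeddings across clients, where only relation embeddings leave each client and entity embeddings stay local. The class and function names, the embedding dimension, and the plain averaging rule are illustrative assumptions for this sketch, not the paper's exact (secure) aggregation protocol.

```python
import numpy as np

DIM = 128  # embedding dimension (assumed for illustration)

class Client:
    """A client holding a private KG and its locally trained relation embeddings."""
    def __init__(self, relation_ids, rng):
        # relation_ids: global IDs of the relations present in this client's KG
        self.rel_emb = {r: rng.normal(size=DIM) for r in relation_ids}

    def upload(self):
        # Only relation embeddings are shared with the server;
        # entity embeddings never leave the client.
        return self.rel_emb

def aggregate_relations(clients):
    """FedAvg-style averaging of the relation embeddings uploaded by all clients."""
    sums, counts = {}, {}
    for c in clients:
        for r, emb in c.upload().items():
            sums[r] = sums.get(r, np.zeros(DIM)) + emb
            counts[r] = counts.get(r, 0) + 1
    return {r: sums[r] / counts[r] for r in sums}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clients = [Client([0, 1, 2], rng), Client([1, 2, 3], rng)]
    global_rel = aggregate_relations(clients)
    # Each client would then download only the averaged embeddings
    # for the relations it owns and continue local training.
    print({r: v.shape for r, v in global_rel.items()})
```

Because the number of distinct relations in a KG is typically far smaller than the number of entities, the uploaded dictionary above is much smaller than an entity-embedding table, which is the intuition behind the communication savings claimed in the abstract.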
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/efficient-federated-learning-on-knowledge/code)
