Efficient and Privacy-Preserving Ranking-Based Federated Learning

Published: 01 Jan 2024 · Last Modified: 17 Apr 2025 · ICA3PP (4) 2024 · CC BY-SA 4.0
Abstract: Many recent works have proposed privacy-preserving schemes to address the privacy issues in federated learning (FL). However, FL also suffers from high communication overhead, since clients are often resource-constrained devices (e.g., mobile phones and wearable devices), so minimizing the communication between the FL server and its clients is essential. Existing works that tackle this problem mainly reduce the upload communication from client to server, while the full model is still transmitted in the download direction from server to client. In this paper, we propose EPRFL (Efficient and Privacy-Preserving Ranking-based Federated Learning) to address this issue. Specifically, each client uses its local data to rank the neural network parameters provided by the server, and a voting mechanism combined with homomorphic encryption is leveraged to aggregate and encrypt the local rankings; the server then aggregates the encrypted local rankings. In addition, we use super-increasing sequences to compress and pack the local rankings efficiently, further reducing communication costs. Finally, we demonstrate the security of EPRFL through a security analysis and its high communication efficiency through experiments.
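To make the compression idea concrete, the following is a minimal sketch (not the paper's exact scheme; all function names and parameters are illustrative) of how a super-increasing sequence can pack several small rank values into one integer, so that a single ciphertext carries many ranks and encrypted packed values from different clients can be summed slot-by-slot, as in additive voting.

```python
def superincreasing_sequence(n, bound):
    """Build a_1 = 1, a_{i+1} = a_i * (bound + 1), so each element exceeds
    `bound` times the sum of slots below it and packed slots never overlap,
    even after summing values up to `bound` per slot."""
    seq = [1]
    for _ in range(n - 1):
        seq.append(seq[-1] * (bound + 1))
    return seq

def pack(ranks, seq):
    """Pack a list of per-parameter ranks into one integer (linear, so
    additively homomorphic encryption of the packed value lets the server
    sum per-slot ranks across clients without decrypting them)."""
    return sum(r * a for r, a in zip(ranks, seq))

def unpack(packed, seq, n):
    """Recover the n per-slot values by greedy division, largest slot first."""
    ranks = []
    for a in reversed(seq):
        ranks.append(packed // a)
        packed %= a
    return list(reversed(ranks))

# Toy example: 2 clients rank 3 parameters with ranks in {0, 1, 2};
# bound = 2 clients * max rank 2 = 4 keeps every aggregated slot overflow-free.
seq = superincreasing_sequence(3, 4)          # [1, 5, 25]
client_a = pack([2, 0, 1], seq)               # would be encrypted in EPRFL
client_b = pack([1, 2, 0], seq)
aggregate = client_a + client_b               # homomorphic addition stand-in
print(unpack(aggregate, seq, 3))              # per-parameter rank sums: [3, 2, 1]
```

Because packing is linear, the plaintext addition above mirrors what an additively homomorphic scheme (e.g., Paillier) would compute on ciphertexts; the only requirement is choosing `bound` large enough that no slot's summed ranks exceed it.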