BSR-FL: An Efficient Byzantine-Robust Privacy-Preserving Federated Learning Framework

Published: 01 Jan 2024, Last Modified: 13 May 2025 · IEEE Trans. Computers 2024 · CC BY-SA 4.0
Abstract: Federated learning (FL) is a technique that enables clients to collaboratively train a model by sharing local models instead of raw private data. However, existing reconstruction attacks can recover sensitive training samples from the shared models, and emerging poisoning attacks pose severe threats to the security of FL. Most existing Byzantine-robust privacy-preserving federated learning solutions either reduce the accuracy of the aggregated model or introduce significant computation and communication overheads. In this paper, we propose a novel Blockchain-based Secure and Robust Federated Learning (BSR-FL) framework to mitigate both reconstruction attacks and poisoning attacks. BSR-FL avoids accuracy loss while ensuring efficient privacy protection and Byzantine robustness. Specifically, we first construct a lightweight non-interactive functional encryption (NIFE) scheme that protects the privacy of local models while maintaining high communication performance. We then propose a privacy-preserving defensive aggregation strategy based on NIFE that resists encrypted poisoning attacks without compromising model privacy, using secure cosine similarity and incentive-based Byzantine-tolerance aggregation. Finally, we employ a blockchain system to facilitate the federated learning process and the execution of the protocols. Extensive theoretical analysis and experiments demonstrate that BSR-FL achieves enhanced privacy, robustness, and high efficiency.
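To illustrate the idea behind cosine-similarity-based defensive aggregation described in the abstract, the sketch below shows a minimal, unencrypted version in Python. It is an assumption-laden illustration, not the paper's protocol: BSR-FL computes these similarities over NIFE-encrypted local models, whereas this sketch operates on plaintext vectors, and the function names (`robust_aggregate`), the trusted `reference_update`, and the `threshold` parameter are hypothetical choices for demonstration only.

```python
import numpy as np

def cosine_similarity(u, v):
    # Cosine similarity between two flattened model updates.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def robust_aggregate(local_updates, reference_update, threshold=0.0):
    """Weight each client's update by its cosine similarity to a trusted
    reference update; updates whose similarity falls below `threshold`
    (suspected Byzantine contributions) receive zero weight."""
    weights = []
    for upd in local_updates:
        sim = cosine_similarity(upd, reference_update)
        weights.append(sim if sim >= threshold else 0.0)
    weights = np.asarray(weights)
    if weights.sum() == 0.0:
        # Fall back to plain averaging if every update was filtered out.
        weights = np.ones(len(local_updates))
    weights = weights / weights.sum()
    return np.average(np.stack(local_updates), axis=0, weights=weights)

# Toy usage: three honest clients and one sign-flipping attacker.
rng = np.random.default_rng(0)
reference = rng.normal(size=10)
honest = [reference + 0.1 * rng.normal(size=10) for _ in range(3)]
attacker = [-reference]  # poisoned update pointing away from the reference
aggregated = robust_aggregate(honest + attacker, reference)
```

In this toy run the attacker's update has negative similarity to the reference and is excluded from the weighted average, which is the qualitative behavior the paper's encrypted aggregation strategy is designed to achieve without revealing the individual updates.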