Fed-MS: Fault Tolerant Federated Edge Learning with Multiple Byzantine Servers

Published: 2024, Last Modified: 05 Jan 2026 · ICDCS 2024 · CC BY-SA 4.0
Abstract: Due to its decentralized architecture and deployment in open outdoor environments, federated edge learning (FEEL) is highly vulnerable to malicious attacks within edge networks. Prevailing FEEL approaches typically rely on a trustworthy parameter server (PS) to contend with adversarial updates from Byzantine clients. Recognizing the inherent unreliability of PSs in edge networks, this paper studies the security challenges of FEEL under Byzantine PSs. We present a Byzantine fault-tolerant FEEL algorithm, named Fed-MS, which combines a multi-server technique with a newly designed trimmed-mean-based model filter. This combination ensures that each client can obtain a feasible global model for its local training that closely approximates the true model aggregated by benign PSs. Furthermore, we propose a sparse uploading strategy in Fed-MS to improve the communication efficiency of model aggregation across multiple PSs. Theoretical analysis shows that, when Byzantine PSs are a minority, Fed-MS achieves an expected convergence rate of $O(1/T)$, where $T$ is the number of training rounds, matching state-of-the-art results in non-Byzantine settings. Extensive experiments are conducted on the CIFAR-10 dataset with MobileNet V2 as the training model. The numerical results show that Fed-MS improves model accuracy from 10% to at least 76% under malicious attacks from Byzantine PSs. Our code is released at https://github.com/haoma2772/Fed-MS.
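The trimmed-mean-based filter described in the abstract can be illustrated with a generic coordinate-wise trimmed mean over the candidate global models a client receives from the multiple servers. This is only a sketch under stated assumptions, not the paper's exact filter: the function name, the flat-vector model representation, and the bound `num_byzantine` on faulty servers are all illustrative.

```python
import numpy as np

def trimmed_mean_filter(models, num_byzantine):
    """Coordinate-wise trimmed mean over candidate global models.

    models: list of 1-D parameter vectors, one per parameter server.
    num_byzantine: assumed upper bound b on the number of Byzantine servers.
    At each coordinate, the b largest and b smallest values are discarded
    and the remaining values are averaged, so up to b arbitrarily corrupted
    models cannot move the estimate outside the range of benign values.
    """
    stacked = np.stack(models)  # shape (m, d): m servers, d parameters
    m = stacked.shape[0]
    if m <= 2 * num_byzantine:
        raise ValueError("need more than 2 * num_byzantine servers")
    # Sort each coordinate independently across servers.
    sorted_vals = np.sort(stacked, axis=0)
    # Drop the b smallest and b largest values per coordinate.
    kept = sorted_vals[num_byzantine : m - num_byzantine]
    return kept.mean(axis=0)
```

For example, with three servers of which one is Byzantine and sends an extreme model, the trimmed mean at each coordinate discards the one largest and one smallest value, leaving the median and suppressing the outlier.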