Sparse Federated Learning with Hierarchical Personalization Models

2022 (modified: 09 Jun 2022) · CoRR 2022
Abstract: Federated learning (FL) enables reliable collaborative training without collecting users' private data. This strong privacy protection has driven a wide range of FL applications in the Internet of Things (IoT), wireless networks, mobile devices, and autonomous vehicles. However, FL methods suffer from poor model performance on non-IID data and excessive communication traffic. We propose a personalized FL algorithm based on a hierarchical proximal mapping with an L2-norm penalty, named sparse federated learning with hierarchical personalized models (sFedHP), which significantly improves global model performance on heterogeneous data. An approximated L1-norm serves as a sparse constraint to reduce the communication cost. Convergence analysis shows that sFedHP achieves a state-of-the-art convergence rate with linear speedup, and that the sparse constraint reduces the convergence rate only slightly while substantially reducing the communication cost. Experimentally, we demonstrate the benefits of sFedHP compared with FedAvg, HierFAVG (hierarchical FedAvg), and personalized FL methods based on local customization, including FedAMP, FedProx, Per-FedAvg, pFedMe, and pFedGP.
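To make the two ingredients named in the abstract concrete, the sketch below shows a pFedMe-style personalization step: each client solves a local proximal problem whose L2-norm term ties its personalized model to the global model, with a smoothed surrogate for the L1-norm encouraging sparsity. This is a minimal illustration assuming a least-squares local loss and a `sqrt(theta^2 + eps)` smoothing of the L1-norm; the function names, solver (plain gradient descent), and all hyperparameter values are illustrative choices, not the paper's actual sFedHP algorithm.

```python
import numpy as np

EPS = 1e-3  # smoothing parameter for the approximated L1-norm (illustrative value)

def smoothed_l1_grad(theta, eps=EPS):
    """Gradient of the smooth surrogate sum(sqrt(theta_i^2 + eps)) of ||theta||_1."""
    return theta / np.sqrt(theta ** 2 + eps)

def personalize(w_global, A, b, lam=1.0, mu=0.5, steps=500):
    """Approximately solve the client-side proximal step
       theta = argmin 0.5*||A theta - b||^2 + (lam/2)*||theta - w_global||^2
               + mu * (smoothed ||theta||_1)
    by gradient descent with a conservative step size (not the paper's solver)."""
    # Step size from a crude Lipschitz bound: quadratic + proximal + smoothed-L1 terms.
    lr = 1.0 / (np.linalg.norm(A, 2) ** 2 + lam + mu / np.sqrt(EPS))
    theta = w_global.copy()
    for _ in range(steps):
        grad = (A.T @ (A @ theta - b)          # local least-squares loss
                + lam * (theta - w_global)     # L2 proximal pull toward the global model
                + mu * smoothed_l1_grad(theta))  # sparsity-inducing term
        theta -= lr * grad
    return theta

# Toy client data: a sparse ground-truth weight vector plus small noise.
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 5))
true_w = np.array([1.0, 0.0, -1.0, 0.0, 0.5])
b = A @ true_w + 0.01 * rng.normal(size=20)
theta = personalize(np.zeros(5), A, b)  # personalized model for this client
```

In a full FL loop, each client would run such a step around the current global model, and the server would aggregate the resulting (sparse, hence cheaper to transmit) updates.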