Keywords: Federated Learning, Attentive Pruning, Heterogeneous Clients, Non-IID Data.
TL;DR: A personalized ATTENtive pruning enabled federateD learnING (ATTENDING) method to address the heterogeneity challenges in federated learning.
Abstract: Federated Learning (FL) has emerged as a novel machine learning paradigm, enabling distributed clients to collaboratively train a global model without transmitting their local data. Despite its advantages, FL faces challenges posed by system and data heterogeneity. System heterogeneity prevents low-end clients from participating in FL with uniform models, while data heterogeneity degrades the learning performance of FL. In this paper, we propose personalized ATTENtive pruning enabled federateD learnING (ATTENDING) to collectively address these heterogeneity challenges. Specifically, we first design an attention module incorporating spatial and channel attention to enhance learning performance on heterogeneous data. We then introduce an attentive pruning algorithm that generates personalized local models guided by attention scores, enabling resource-constrained clients to participate in FL. Finally, we present a heterogeneous aggregation algorithm integrated with an attention matching mechanism to efficiently aggregate the pruned models. We implement ATTENDING on a real FL platform, and the evaluation results show that ATTENDING outperforms the baselines by up to 11.3% and reduces average model footprints by 32%. Our code is available at: https://anonymous.4open.science/r/ATTENDING.
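For intuition, below is a minimal PyTorch sketch of how channel-attention scores could guide pruning in the spirit the abstract describes. The `ChannelAttention` module, `attentive_prune_mask` helper, and all parameter values are illustrative assumptions, not the paper's actual implementation (which also involves spatial attention and server-side attention matching).

```python
# Hypothetical sketch (not ATTENDING's code): rank convolutional channels by
# channel-attention scores and mask out the lowest-scoring ones.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation-style channel attention (one common variant)."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, H, W) -> per-channel scores in [0, 1]
        return self.fc(x.mean(dim=(2, 3)))  # global average pool, then MLP

def attentive_prune_mask(scores: torch.Tensor, keep_ratio: float) -> torch.Tensor:
    """Keep the top `keep_ratio` fraction of channels by mean attention score."""
    mean_scores = scores.mean(dim=0)               # average scores over the batch
    k = max(1, int(keep_ratio * mean_scores.numel()))
    mask = torch.zeros_like(mean_scores, dtype=torch.bool)
    mask[torch.topk(mean_scores, k).indices] = True
    return mask                                    # (channels,) boolean keep-mask

# Usage: score channels on a local batch, then zero out low-attention channels.
attn = ChannelAttention(channels=64)
batch = torch.randn(8, 64, 32, 32)
mask = attentive_prune_mask(attn(batch), keep_ratio=0.68)
pruned = batch * mask.view(1, -1, 1, 1)            # masked activations
```

In a personalized-FL setting, each client would pick its own `keep_ratio` to match its compute budget, which is presumably how heterogeneous clients end up with differently sized local models.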
Primary Area: other topics in machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6227