Keywords: Federated Learning, Edge Computing, Communication-Efficient, Knowledge Distillation
Abstract: Federated learning (FL) is a popular distributed machine learning framework for edge computing. However, it faces a significant challenge: the communication overhead caused by frequent model updates between clients and the central server. Previous studies have overlooked a crucial piece of information: the central server already knows each client's initial model before local training begins in every round. This oversight leads to significant redundancy in communication, as full model information is transmitted unnecessarily. To address this, we propose a novel framework called \textit{model update distillation} (MUD), which leverages this prior knowledge to decouple model parameters from the network architecture. Instead of transmitting raw parameter updates, our method synthesizes and transmits compact tensor sequences that encode only the information essential for synchronization. This dramatically reduces communication overhead while still allowing the recipient to accurately reconstruct the intended model update. Extensive experimental results demonstrate that the resulting method, FedMUD, achieves substantial improvements in communication efficiency, making it a highly effective solution for federated learning in bandwidth-constrained environments. The PyTorch-like core code can be found in Algorithm \ref{alg: pytorch}.
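The abstract does not specify the exact distillation procedure, so the following is only a minimal, hypothetical sketch of the general idea: because client and server share the initial model, the client can fit a small set of synthetic tensors whose reconstruction approximates its local update and transmit only those tensors. The low-rank factorization, function names, and shapes below are illustrative assumptions, not the paper's actual MUD algorithm.

import torch

def distill_update(update: torch.Tensor, rank: int = 4, steps: int = 200, lr: float = 1e-2):
    # Hypothetical stand-in for MUD's synthesized tensor sequence:
    # fit compact factors (U, V) so that U @ V approximates a 2-D parameter update.
    m, n = update.shape
    U = torch.randn(m, rank, requires_grad=True)
    V = torch.randn(rank, n, requires_grad=True)
    opt = torch.optim.Adam([U, V], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = torch.nn.functional.mse_loss(U @ V, update)
        loss.backward()
        opt.step()
    # Only these compact tensors would be transmitted instead of the full update.
    return U.detach(), V.detach()

def reconstruct_update(U: torch.Tensor, V: torch.Tensor) -> torch.Tensor:
    # Receiver side: rebuild the (approximate) update from the compact tensors.
    return U @ V

if __name__ == "__main__":
    w_init = torch.randn(256, 512)                    # initial model, known to both sides
    w_local = w_init + 0.01 * torch.randn(256, 512)   # client model after local training
    U, V = distill_update(w_local - w_init)
    w_synced = w_init + reconstruct_update(U, V)      # server-side synchronization
    print("compression ratio:", w_init.numel() / (U.numel() + V.numel()))

In this sketch the communication saving is governed by the chosen rank; the paper's actual synthesized tensor sequences and reconstruction rule are given in Algorithm \ref{alg: pytorch}.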
Primary Area: alignment, fairness, safety, privacy, and societal considerations
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5551