An Efficient Federated Learning Framework for Training Semantic Communication Systems

Published: 01 Jan 2024 · Last Modified: 13 Nov 2024 · IEEE Trans. Veh. Technol. 2024 · License: CC BY-SA 4.0
Abstract: Semantic communication has emerged as a pillar of next-generation communication systems owing to its ability to alleviate data redundancy. Most semantic communication systems are built upon advanced deep learning models whose training performance heavily relies on data availability. Existing studies often assume a readily accessible data source, whereas in practice data is mainly created on the client side, and privacy concerns limit its transmission to the central server that conventional centralized training schemes require. To address this challenge, we explore semantic communication in a federated learning (FL) setting, which harnesses client data without compromising privacy. Additionally, we design our system to lower the communication overhead by reducing the quantity of information delivered in each FL training round. This not only saves significant bandwidth for resource-constrained devices but also reduces overall network traffic. Finally, we introduce a mechanism that aggregates the global model based on each client's performance, which we refer to as FedLol. Extensive simulation results demonstrate the effectiveness of the proposed technique compared to baseline methods.
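The abstract does not spell out the FedLol aggregation rule, so below is a minimal sketch of one plausible performance-weighted aggregation scheme, assuming each client reports its local loss and that lower-loss clients receive proportionally larger weights. The function name fedlol_aggregate and the specific weighting formula are illustrative assumptions, not the paper's exact definition.

```python
import copy
from typing import Dict, List

import torch


def fedlol_aggregate(client_states: List[Dict[str, torch.Tensor]],
                     client_losses: List[float]) -> Dict[str, torch.Tensor]:
    """Performance-weighted model aggregation (illustrative sketch).

    Assumption: client k's weight is proportional to the summed loss of the
    *other* clients, so a client with a lower local loss contributes more.
    The weights sum to 1. The paper's exact FedLol rule may differ.
    """
    num_clients = len(client_states)
    total_loss = sum(client_losses)
    weights = [(total_loss - loss) / ((num_clients - 1) * total_loss)
               for loss in client_losses]

    # Weighted average of every parameter tensor across clients.
    global_state = copy.deepcopy(client_states[0])
    for name in global_state:
        global_state[name] = sum(
            w * state[name].float() for w, state in zip(weights, client_states)
        )
    return global_state


# Example: three clients with identical tiny models and different losses.
clients = [{"w": torch.full((2, 2), float(i))} for i in range(3)]
losses = [0.2, 0.5, 0.8]  # client 0 performs best, so it gets the largest weight
print(fedlol_aggregate(clients, losses)["w"])
```

With this weighting, the three clients above receive weights of roughly 0.43, 0.33, and 0.23, so the aggregate is pulled toward the best-performing client while still incorporating the others.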