Cooperative Multiple Model Training for Personalized Federated Learning over Heterogeneous Devices

Published: 26 Aug 2024, Last Modified: 26 Aug 2024
Venue: FedKDD 2024 Poster
License: CC BY 4.0
Keywords: Federated Learning, Heterogeneous Devices, Multiple Models
Abstract: Federated learning (FL) is an increasingly popular paradigm for protecting data privacy in machine learning systems. However, data heterogeneity and high computation cost/latency pose challenging barriers to deploying FL in real-world applications with heterogeneous devices. In this paper, we propose a novel personalized FL framework named $\mathtt{CompFL}$ that allows cooperative training of models with varied structures to mitigate these issues. First, $\mathtt{CompFL}$ initializes a set of expert models of varied sizes and lets each client choose one or more expert models to train according to its capacity. Second, $\mathtt{CompFL}$ combines a model decoupling strategy with local-global feature alignment to mitigate the adverse impact of label heterogeneity: clients share only the representation layers of each model architecture. Third, to encourage mutual enhancement among the various models, knowledge distillation is applied during local training to improve overall performance. To make the framework workable in real systems, we implement it both in centralized settings via server-coordinated parallel training and in decentralized settings via a newly developed device-to-device model training-forwarding scheme. Extensive experiments on benchmark datasets verify the potential of our approach.
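
The abstract combines three mechanisms (capacity-based expert selection, representation-only sharing, and mutual knowledge distillation in local training). The sketch below illustrates how such a local update could fit together in PyTorch. It is a minimal sketch based only on the abstract: every class, function, and hyperparameter name here is a hypothetical illustration, not the authors' implementation, and the feature-alignment term is assumed to use global class prototypes.

```python
# Hypothetical sketch of a CompFL-style local update, assuming PyTorch.
# Not the authors' code; names and hyperparameters are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ExpertModel(nn.Module):
    """An expert model decoupled into shared representation layers ('body')
    and a private classification head, as the abstract describes."""
    def __init__(self, width: int, num_classes: int = 10):
        super().__init__()
        self.body = nn.Sequential(          # shared with server / peers
            nn.Flatten(),
            nn.Linear(28 * 28, width),
            nn.ReLU(),
            nn.Linear(width, width),
            nn.ReLU(),
        )
        self.head = nn.Linear(width, num_classes)  # kept local per client

    def forward(self, x):
        return self.head(self.body(x))

def local_update(experts, loader, global_protos=None,
                 lr=0.01, kd_temp=2.0, kd_weight=0.5, align_weight=0.1):
    """One local round for the expert models a client chose by capacity.

    Combines (i) cross-entropy on local data, (ii) mutual knowledge
    distillation between the client's experts, and (iii) an assumed
    local-global feature-alignment term against class prototypes.
    """
    params = [p for m in experts for p in m.parameters()]
    opt = torch.optim.SGD(params, lr=lr)
    for x, y in loader:
        logits = [m(x) for m in experts]
        loss = sum(F.cross_entropy(z, y) for z in logits)
        # Mutual KD: each expert matches the averaged soft targets
        # of all local experts (held fixed via no_grad).
        if len(experts) > 1:
            with torch.no_grad():
                soft = torch.stack(
                    [F.softmax(z / kd_temp, dim=1) for z in logits]).mean(0)
            for z in logits:
                loss = loss + kd_weight * F.kl_div(
                    F.log_softmax(z / kd_temp, dim=1), soft,
                    reduction="batchmean") * kd_temp ** 2
        # Feature alignment: pull local features toward global prototypes
        # (assumed shape: [num_classes, width]).
        if global_protos is not None:
            for m in experts:
                feats = m.body(x)
                loss = loss + align_weight * F.mse_loss(
                    feats, global_protos[y])
        opt.zero_grad()
        loss.backward()
        opt.step()
    # Only the representation layers are shared back, per the abstract;
    # heads stay on-device for personalization.
    return [m.body.state_dict() for m in experts]
```

Under this reading, the server (or a device-to-device forwarding scheme in the decentralized setting) would aggregate only the returned `body` state dicts per architecture, leaving each client's `head` untouched.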
Submission Number: 1