In practical deployments of federated learning (FL), a major challenge arises from the diverse, heterogeneous edge devices found in real-world scenarios, each with different computational resources. Conventional FL approaches, which assume uniform model capacity, face a dilemma: they can adopt a large global model, which may be infeasible on resource-constrained devices and lead to unfairness and training bias, or a small global model, which lacks the capacity to represent complex patterns. In this paper, we present Dynamic Federated Learning (DynamicFL), a novel approach that employs structural re-parameterization to achieve adaptive local model modulation and seamless knowledge transfer across a diverse set of heterogeneous models. DynamicFL treats all clients equitably, enabling each to participate in training with its full computational capacity, thereby fostering sustainability within the FL ecosystem. Extensive experimental results validate that DynamicFL surpasses state-of-the-art techniques, including knowledge distillation and network pruning-based methods, achieving significantly higher test accuracy in heterogeneous FL.
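The abstract does not detail the re-parameterization mechanism, but a common form of structural re-parameterization (popularized by RepVGG) merges parallel convolution branches into a single kernel after training, so that a richer training-time structure collapses into a cheaper inference-time model. The sketch below is an illustrative assumption, not the paper's method: it merges a 3x3 branch and a parallel 1x1 branch (single channel, stride 1) into one equivalent 3x3 kernel using plain NumPy.

```python
import numpy as np

def conv2d(x, k, pad):
    """Naive single-channel 2-D convolution (cross-correlation), stride 1."""
    x = np.pad(x, pad)
    kh, kw = k.shape
    H, W = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))
k3 = rng.standard_normal((3, 3))   # 3x3 training-time branch
k1 = rng.standard_normal((1, 1))   # parallel 1x1 training-time branch

# Multi-branch output: sum of the two parallel convolutions.
y_branches = conv2d(x, k3, pad=1) + conv2d(x, k1, pad=0)

# Re-parameterized single kernel: embed the 1x1 kernel at the center
# of the 3x3 kernel (equivalent to zero-padding it to 3x3 and adding).
k_merged = k3.copy()
k_merged[1, 1] += k1[0, 0]
y_merged = conv2d(x, k_merged, pad=1)

# The merged single-branch model reproduces the multi-branch output.
assert np.allclose(y_branches, y_merged)
```

Because convolution is linear, the merged kernel is exactly equivalent to the branch sum; in a heterogeneous-FL setting, such equivalences let differently structured local models exchange knowledge through a shared canonical form.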