HASFL: Harnessing Heterogeneous Models Across Diverse Devices for Enhanced Federated Learning

Published: 01 Jan 2024 · Last Modified: 05 Mar 2025 · ICPP 2024 · CC BY-SA 4.0
Abstract: Recent advances in federated learning have shown promising results in resource-constrained edge environments. However, as mobile devices become more capable of collecting data, individual clients cannot fully exploit that data because their devices offer limited support for training complex models. Conversely, non-portable devices possess substantial computational resources, but the data they collect on their own is insufficient to train complex models. In this paper, we introduce HASFL, a novel split federated learning (SFL) framework that supports model-structure heterogeneity across devices and decouples computation from the model. Through circular group training, HASFL enables mobile devices to train their own data with complex models while ensuring that non-portable devices harness the data collected by mobile users. HASFL thereby addresses the challenge of applying advanced machine learning models in resource-constrained environments, leveraging the collective power of distributed devices without compromising data security. We implement a circular group allocation method based on an online algorithm to ensure cooperative training among heterogeneous models within each group while minimizing training time. In addition, we evaluate HASFL on various datasets and model architectures and analyze its communication overhead. The experimental results demonstrate that HASFL supports the training of heterogeneous models and significantly improves model accuracy with only a relatively small increase in communication overhead.
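The abstract describes an online circular group allocation that assigns heterogeneous devices to groups so that each group can train cooperatively while keeping training time low. The paper's actual algorithm is not given here; the following is a minimal illustrative sketch, assuming each arriving device reports an estimated per-round training time and that a group's round time is dominated by its slowest member. All function names and the cost model are hypothetical.

```python
# Hypothetical online group-allocation sketch (not the paper's algorithm).
# Each arriving device is greedily placed in the group whose bottleneck
# (slowest member's estimated training time) would grow the least,
# approximating the goal of minimizing per-group training time.

def assign_device(groups, device_time):
    """Place one arriving device in the group minimizing the new bottleneck."""
    best = min(
        range(len(groups)),
        key=lambda g: max(groups[g] + [device_time]),
    )
    groups[best].append(device_time)
    return best

def allocate_online(device_times, num_groups):
    """Process devices in arrival order, as an online algorithm would."""
    groups = [[] for _ in range(num_groups)]
    for t in device_times:
        assign_device(groups, t)
    return groups

# Example: six devices with heterogeneous per-round times, two groups.
groups = allocate_online([5.0, 1.0, 3.0, 2.0, 4.0, 6.0], num_groups=2)
```

An online rule of this kind decides each placement as a device arrives, without knowledge of future arrivals, which matches the abstract's framing of allocation as an online problem.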