HeteroSFL: Split Federated Learning with heterogeneous clients and non-IID data

22 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Split Federated Learning, Communication Heterogeneity, Non-IID data
Abstract: Split federated learning (SFL) is an emerging privacy-preserving decentralized learning scheme that splits a machine learning model so that most of the computation is offloaded to the server. While SFL is edge-friendly, it has a high communication cost, and so existing SFL schemes focus on reducing the communication cost of systems with homogeneous clients. However, a more realistic scenario is one where clients are heterogeneous, i.e., they have different system capabilities, including computing power and communication data rates. We focus on heterogeneity due to different data rates, since in SFL the computation at the client end is quite small. In this paper, we propose HeteroSFL, the first SFL framework for heterogeneous clients that handles non-IID data with label distribution skew across clients and across groups of clients. HeteroSFL compresses data with different compression factors in the low-end and high-end groups using narrow and wide bottleneck layers (BLs), respectively. It provides a mechanism to address the challenge of aggregating different-sized BL models, and uses bidirectional knowledge sharing (BDKS) to address the overfitting caused by the different label distributions across the high- and low-end groups. We show that HeteroSFL achieves a significant training-time reduction with minimal accuracy loss compared to competing methods. Specifically, it can reduce the training time of SFL by 16× to 256× with 1.24% to 5.59% accuracy loss for VGG11 on CIFAR10 with non-IID data.
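The core compression idea in the abstract, narrow bottleneck layers for low-end clients and wide ones for high-end clients, can be illustrated with a minimal sketch. This is not the authors' code; the dimensions, the linear (single-matrix) bottleneck, and names like `bl_width` are illustrative assumptions chosen only to show how bottleneck width determines the per-sample communication volume at the split point.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_bottleneck(feat_dim, bl_width):
    """Down/up projection pair for a linear bottleneck of width bl_width.

    The client applies W_down to the split-layer activation and transmits
    the compressed vector; the server applies W_up before continuing the
    forward pass. (A real BL would typically be a trained layer pair.)
    """
    w_down = rng.standard_normal((feat_dim, bl_width)) / np.sqrt(feat_dim)
    w_up = rng.standard_normal((bl_width, feat_dim)) / np.sqrt(bl_width)
    return w_down, w_up

feat_dim = 512  # assumed activation size at the split layer

# Narrow BL for low-end (slow-link) clients, wide BL for high-end clients.
narrow_down, narrow_up = make_bottleneck(feat_dim, bl_width=8)
wide_down, wide_up = make_bottleneck(feat_dim, bl_width=64)

x = rng.standard_normal((1, feat_dim))  # one client-side activation vector
z_narrow = x @ narrow_down              # low-end client sends 8 values
z_wide = x @ wide_down                  # high-end client sends 64 values

# Compression factor = original activation size / transmitted size.
print(feat_dim // z_narrow.shape[1])    # 64x compression for low-end group
print(feat_dim // z_wide.shape[1])      # 8x compression for high-end group
```

The different BL widths are what make server-side aggregation nontrivial (the narrow and wide projection matrices have different shapes), which is the problem HeteroSFL's aggregation mechanism and BDKS are designed to address.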
Supplementary Material: zip
Primary Area: general machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6298