Fed3+2p: Training different parts of neural network with two-phase strategy

26 Sept 2024 (modified: 26 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: Federated Learning, Distributed Machine Learning, Neural Networks, Non-IID Data
TL;DR: We propose a new federated learning framework called Fed3+2p, which features a three-part split neural network architecture, a two-phase training strategy, and coordinators to manage client training.
Abstract: In federated learning, non-identically distributed data degrades both global and local performance, and clients with small data volumes may also suffer from overfitting. To address these challenges, we propose a federated learning framework called Fed3+2p. Fed3+2p divides the client neural network into three parts: a feature extractor, a filter, and classification heads, and trains these parts with a two-phase strategy managed by two types of coordinators. In the first phase, each Type-A coordinator trains the feature extractors of a subset of clients whose joint data distribution is similar to the global data distribution. In the second phase, each Type-B coordinator trains the filters and classification heads of a subset of clients whose data distributions are similar to one another. We conduct empirical studies on three datasets (FMNIST and CIFAR-10/100), and the results show that Fed3+2p surpasses state-of-the-art methods in both global and local performance across all tested datasets.
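The two client-grouping criteria described in the abstract can be illustrated with a small sketch. This is not the paper's actual algorithm: the greedy selection, the L1 histogram distance, and the threshold clustering below are all assumptions chosen only to make the two phases concrete (Phase 1: pool clients whose joint label distribution approximates the global one; Phase 2: group clients whose label distributions resemble each other).

```python
from collections import Counter

def label_hist(labels, num_classes):
    """Normalized label-frequency vector for one client's dataset."""
    counts = Counter(labels)
    n = len(labels)
    return [counts.get(c, 0) / n for c in range(num_classes)]

def l1(p, q):
    """L1 distance between two histograms (an assumed similarity measure)."""
    return sum(abs(a - b) for a, b in zip(p, q))

def phase1_group(client_labels, num_classes, group_size):
    """Sketch of a Type-A coordinator's client set: greedily add the
    client that brings the pooled label distribution closest to the
    global label distribution."""
    all_labels = [y for ys in client_labels.values() for y in ys]
    global_h = label_hist(all_labels, num_classes)
    chosen, pooled = [], []
    remaining = set(client_labels)
    while remaining and len(chosen) < group_size:
        best = min(
            sorted(remaining),
            key=lambda cid: l1(
                label_hist(pooled + client_labels[cid], num_classes), global_h
            ),
        )
        chosen.append(best)
        pooled += client_labels[best]
        remaining.remove(best)
    return chosen

def phase2_cluster(client_labels, num_classes, threshold):
    """Sketch of Type-B coordinator assignment: clients whose label
    distributions are within `threshold` of a cluster's first member
    share a coordinator."""
    hists = {cid: label_hist(ys, num_classes) for cid, ys in client_labels.items()}
    clusters = []
    for cid, h in hists.items():
        for cluster in clusters:
            if l1(h, hists[cluster[0]]) <= threshold:
                cluster.append(cid)
                break
        else:
            clusters.append([cid])
    return clusters
```

For example, with two single-class clients and two balanced clients, Phase 1 would select the balanced clients (their pool matches the global distribution), while Phase 2 would place the two balanced clients in one cluster and each skewed client in its own.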
Supplementary Material: zip
Primary Area: other topics in machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6557