FAST: Federated Average with Snapshot Unleashes Arbitrary Client Participation

Anonymous Authors (ICLR 2025 Conference Submission 192)

Published: 13 Sept 2024 (modified: 13 Oct 2024) · ICLR 2025 Conference Submission · License: CC BY 4.0
Keywords: Federated Learning, Optimization
Abstract: Federated Learning (FL) provides a flexible distributed platform where numerous clients with highly heterogeneous data and systems can collaborate to learn a model jointly. Prior work has shown that FL handles diverse data effectively, but often under idealized conditions: client participation is typically simplified, whereas real-world factors make individual client participation difficult to predict or design. This complexity diverges from the ideal client participation assumption, yielding an unknown participation pattern referred to as *arbitrary client participation*. It is therefore an important open problem to understand the impact of client participation and to find a lightweight mechanism that enables arbitrary client participation in FL. In this paper, we first empirically investigate the influence of client participation on FL, revealing that FL algorithms are significantly degraded by arbitrary client participation. To alleviate this effect, we propose a lightweight solution, Federated Average with Snapshot (FAST), which enables almost arbitrary client participation and integrates seamlessly with classic FL algorithms. Specifically, FAST requires clients to take a snapshot once in a while and permits arbitrary client participation for the majority of the training process. We establish convergence rates for FAST in the non-convex and strongly convex cases that match the rates under ideal client participation. Furthermore, we empirically introduce an adaptive strategy for dynamically configuring the snapshot frequency, tailored to diverse FL systems. Our extensive numerical results demonstrate that FAST attains significant improvements under arbitrary client participation and highly heterogeneous data.
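The snapshot mechanism described in the abstract can be illustrated with a toy sketch. This is a hypothetical reconstruction from the abstract alone, not the authors' implementation: every `snapshot_every` rounds all clients participate (the "snapshot"), while in the remaining rounds an arbitrary, possibly small subset participates; each client runs one gradient step on a scalar quadratic, and the server applies plain FedAvg. All names and parameters (`fast_simulation`, `snapshot_every`, `lr`) are illustrative assumptions.

```python
import random

def fast_simulation(num_clients=10, rounds=50, snapshot_every=10, lr=0.5, seed=0):
    """Toy sketch of a FAST-style training loop (hypothetical, reconstructed
    from the abstract): periodic snapshot rounds with full participation,
    arbitrary client participation in all other rounds."""
    rng = random.Random(seed)
    # Each client i holds a local optimum of f_i(w) = 0.5 * (w - target_i)^2;
    # the global optimum of the average objective is the mean of the targets.
    targets = [rng.uniform(-1.0, 1.0) for _ in range(num_clients)]
    global_opt = sum(targets) / num_clients
    w = 0.0  # global model (a scalar, for illustration only)
    for t in range(1, rounds + 1):
        if t % snapshot_every == 0:
            # Snapshot round: every client participates.
            participants = range(num_clients)
        else:
            # Arbitrary participation: an unpredictable subset of clients.
            k = rng.randint(1, num_clients // 2)
            participants = rng.sample(range(num_clients), k)
        # One local gradient step per participating client, then FedAvg.
        updates = [w - lr * (w - targets[i]) for i in participants]
        w = sum(updates) / len(updates)
    return w, global_opt
```

Because the final round here is a snapshot round, the aggregated model is pulled back toward the true global optimum even if the intervening arbitrary-participation rounds were biased, which is the intuition behind why occasional snapshots can compensate for unknown participation patterns.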
Supplementary Material: zip
Primary Area: optimization
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 192