Hierarchical Global Asynchronous Federated Learning Across Multi-Center

Published: 05 Sept 2024, Last Modified: 16 Oct 2024 · ACML 2024 Conference Track · CC BY 4.0
Keywords: Asynchronous Federated Learning, heterogeneity, multi-regional centers, hierarchical framework
Verify Author List: I have double-checked the author list and understand that additions and removals will not be allowed after the submission deadline.
Abstract: Federated learning for training machine learning models across geographically distributed regional centers is becoming prevalent. However, because of disparities in location, latency, and computational capability, synchronously aggregating models across sites requires waiting for stragglers, leading to significant delays. Traditional asynchronous aggregation across regional centers still suffers from stale model parameters and outdated gradients, because the hierarchical aggregation also involves local clients within each center. To address this, we propose Hierarchical Global Asynchronous Federated Learning (HGA-FL), which combines global asynchronous model aggregation across regional centers with synchronous aggregation and local consistency-regularization alignment within each center. We theoretically analyze the convergence rate of our method in the non-convex optimization setting, demonstrating stable convergence during aggregation. Experimental evaluations show that our approach outperforms other two-level aggregation baselines in terms of global-model generalization, particularly under data heterogeneity, latency, and gradient staleness.
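To make the two-level scheme described in the abstract concrete, the following is a minimal sketch (not the authors' implementation, and omitting the consistency-regularization term): clients inside a regional center are combined synchronously in the style of FedAvg, while each center's result is merged into the global model asynchronously with a staleness-attenuated mixing coefficient, as in FedAsync-style updates. All function names, the `base_mixing` parameter, and the toy weight vectors are illustrative assumptions.

```python
import numpy as np


def intra_center_sync_aggregate(client_weights, client_sizes):
    """Synchronous FedAvg over the clients of one regional center."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))


def global_async_update(global_weights, center_weights, staleness, base_mixing=0.6):
    """Asynchronously merge one center's model into the global model.

    The mixing coefficient is attenuated by staleness (global steps elapsed
    since the center pulled the global model), so outdated updates count less.
    """
    alpha = base_mixing / (1.0 + staleness)
    return (1.0 - alpha) * global_weights + alpha * center_weights


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    global_w = np.zeros(4)          # toy "model": a 4-dimensional weight vector
    global_step = 0
    # Each arriving center update carries: (step at which it pulled the global
    # model, number of participating clients).
    for pulled_at, n_clients in [(0, 3), (0, 2), (1, 4)]:
        clients = [global_w + 0.1 * rng.standard_normal(4) for _ in range(n_clients)]
        sizes = rng.integers(50, 200, size=n_clients).tolist()
        center_w = intra_center_sync_aggregate(clients, sizes)
        staleness = global_step - pulled_at     # how old the center's base model is
        global_w = global_async_update(global_w, center_w, staleness)
        global_step += 1
    print("global model after 3 asynchronous center updates:", global_w)
```

In this sketch the global model never waits for slow centers; a late-arriving center update is still applied, but its influence shrinks with its staleness, which is the intuition behind avoiding straggler delays while limiting the damage from outdated gradients.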
A Signed Permission To Publish Form In Pdf: pdf
Supplementary Material: pdf
Primary Area: General Machine Learning (active learning, bayesian machine learning, clustering, imitation learning, learning to rank, meta-learning, multi-objective learning, multiple instance learning, multi-task learning, neuro-symbolic methods, etc.)
Paper Checklist Guidelines: I certify that all co-authors of this work have read and commit to adhering to the guidelines in Call for Papers.
Student Author: Yes
Submission Number: 160