On the Convergence of Hierarchical Federated Learning with Partial Worker Participation

Published: 26 Apr 2024, Last Modified: 15 Jul 2024 · UAI 2024 poster · CC BY 4.0
Keywords: Hierarchical Federated Learning, Partial Worker Participation, Convergence Analysis
TL;DR: This work proposes a novel convergence analysis of hierarchical federated learning under both full and partial worker participation with non-i.i.d. data, non-convex objective functions, and stochastic gradients.
Abstract: Hierarchical federated learning (HFL) has emerged as the architecture of choice for multi-level communication networks, mainly because of its data privacy protection and low communication cost. However, existing convergence analyses for HFL are limited to the assumptions of full worker participation and/or i.i.d. datasets across workers, both of which rarely hold in practice. Motivated by this, we propose in this work a unified convergence analysis framework for HFL that covers both full and partial worker participation with non-i.i.d. data, non-convex objective functions, and stochastic gradients. We correspondingly develop a three-sided learning rates algorithm to mitigate the data divergence issue, thereby achieving better convergence performance. Our theoretical results provide key insights into why partial participation in HFL significantly reduces data divergence compared to standard FL. Moreover, the convergence analysis allows a degree of per-cluster individualization in HFL, indicating that adjusting the worker sampling ratio and round period can improve the convergence behavior.
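The abstract's setup can be illustrated with a toy sketch of hierarchical FL using partial worker participation and three learning rates (worker-side `eta_l`, edge-side `eta_e`, global-side `eta_g`). All function names, the quadratic local objectives, and the hyperparameter values below are illustrative assumptions, not the paper's exact algorithm:

```python
import random

def local_sgd(w, data, eta_l, steps):
    """Run `steps` of SGD on the local objective f_i(w) = 0.5*(w - d)^2
    for data points d sampled from this worker's (non-i.i.d.) dataset."""
    for _ in range(steps):
        d = random.choice(data)
        grad = w - d            # gradient of the local quadratic
        w = w - eta_l * grad    # worker-side learning rate
    return w

def hfl_round(w_global, clusters, eta_l, eta_e, eta_g,
              sample_ratio=0.5, local_steps=5, edge_periods=2):
    """One global round: edge servers aggregate sampled workers
    for several periods, then the cloud aggregates the clusters."""
    cluster_deltas = []
    for workers in clusters:            # each cluster is a list of worker datasets
        w_edge = w_global
        for _ in range(edge_periods):
            k = max(1, int(sample_ratio * len(workers)))
            sampled = random.sample(workers, k)   # partial worker participation
            deltas = [local_sgd(w_edge, data, eta_l, local_steps) - w_edge
                      for data in sampled]
            w_edge = w_edge + eta_e * sum(deltas) / k   # edge-side learning rate
        cluster_deltas.append(w_edge - w_global)
    # cloud aggregation with its own (global-side) learning rate
    return w_global + eta_g * sum(cluster_deltas) / len(clusters)

random.seed(0)
# non-i.i.d. data: each worker's points are centered differently per cluster
clusters = [[[1.0, 1.2], [0.8, 1.1]], [[3.0, 2.9], [3.1, 2.8]]]
w = 0.0
for _ in range(50):
    w = hfl_round(w, clusters, eta_l=0.1, eta_e=1.0, eta_g=1.0)
print(round(w, 2))  # should settle near the mean of all data (~2.0)
```

The three rates decouple how aggressively each tier moves: local steps trade computation for communication, while the edge and global rates damp the client-drift that non-i.i.d. data induces.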
List Of Authors: Jiang, Xiaohan and Zhu, Hongbin
Latex Source Code: zip
Signed License Agreement: pdf
Code Url: https://github.com/cardistryj/HFL
Submission Number: 144