Abstract: Federated Learning (FL) is a distributed learning paradigm that coordinates multiple clients to jointly train a machine learning model on their local data samples. Existing FL works can be roughly divided into two categories according to the mode of model training: Parallel FL (PFL) and Sequential FL (SFL). PFL shortens each round of training through parallel local updates, but it may suffer from convergence degradation under data heterogeneity. SFL handles the heterogeneity issue well and reduces the number of training rounds, but it spends more time in each round of local training due to its sequential mode. In this paper, we propose a novel hybrid Parallel-Sequential Federated Learning (PSFL) framework that integrates the parallel and sequential training modes. Through theoretical analysis, we derive upper bounds on the model convergence and the expected total training time of the PSFL framework. Based on these results, we identify the optimal training structure and design a client sampling strategy that balances the two training modes while guaranteeing unbiasedness. Extensive experiments validate our theoretical analysis and demonstrate the superior performance of the PSFL framework.
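The abstract does not spell out the hybrid training structure. As a rough illustration only, the sketch below (Python/NumPy) assumes one common way to combine the two modes: clients are partitioned into groups, the model is passed sequentially through the clients of each group (SFL-style), and the resulting group models are averaged in parallel (PFL-style). All names (`hybrid_round`, `local_update`), the toy least-squares objective, and the grouping rule are illustrative assumptions, not the paper's actual algorithm or its derived optimal structure.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(model, data, lr=0.1):
    """One step of local gradient descent on a toy least-squares
    objective, standing in for a client's local training."""
    X, y = data
    grad = X.T @ (X @ model - y) / len(y)
    return model - lr * grad

def hybrid_round(model, clients, num_groups):
    """One hypothetical PSFL round (an assumption, not the paper's
    algorithm): train sequentially within each group, then average
    the group models in parallel."""
    groups = np.array_split(np.arange(len(clients)), num_groups)
    group_models = []
    for group in groups:          # groups could run in parallel in practice
        m = model.copy()
        for cid in group:         # sequential pass within a group
            m = local_update(m, clients[cid])
        group_models.append(m)
    return np.mean(group_models, axis=0)  # parallel aggregation

# Toy setup: 8 clients with heterogeneous linear-regression data,
# each drawn around a different per-client optimum.
d = 5
clients = []
for _ in range(8):
    X = rng.normal(size=(20, d))
    w_true = rng.normal(size=d)
    clients.append((X, X @ w_true + 0.01 * rng.normal(size=20)))

model = np.zeros(d)
for _ in range(50):
    model = hybrid_round(model, clients, num_groups=2)
```

With `num_groups` equal to the number of clients the round degenerates to pure PFL (one client per group, parallel averaging); with `num_groups = 1` it degenerates to pure SFL (one sequential pass over all clients). The group count is thus the knob that trades per-round wall-clock time against robustness to heterogeneity, which is the balance the paper's analysis is stated to optimize.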