SEM: A Simple Yet Efficient Model-agnostic Local Training Mechanism to Tackle Data Sparsity and Scarcity in Federated Learning
Abstract: Recent years have witnessed the emergence of Federated Learning (FL) as a viable learning paradigm that permits training models without revealing sensitive data. FL systems typically consist of numerous clients that use their data to train models locally, and an orchestration server responsible for combining the clients' local updates into a global model. The performance of an FL system therefore depends heavily on the client data and the local models trained on them. In this study, we present an early attempt at addressing the sparsity and scarcity of client data, which can cause local models to overfit and substantially reduce the accuracy of the global model. Specifically, we propose a novel local training strategy that exploits transfer learning and allows each local model to be trained on the data of two randomly paired clients. The proposed method is orthogonal to other Federated Learning algorithms and can be integrated into most FL systems. Extensive experiments in various settings on the MNIST, CIFAR-10, and CIFAR-100 datasets show that our method can relatively improve the accuracy of the global model by up to 12.48%. Our work, for the first time, offers a simple yet effective solution that mitigates the undesired effects of data sparsity and scarcity in FL.
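To make the pairing idea concrete, below is a minimal, hypothetical sketch of a FedAvg-style round in which clients are randomly paired and each local model is trained first on its own client's data and then on its partner's data before server-side averaging. All names (`sgd_epochs`, `paired_local_round`) and the toy logistic-regression setup are illustrative assumptions, not the authors' SEM implementation.

```python
# Hypothetical sketch of paired-client local training in an FL round.
# Clients are randomly paired; each local model trains on its own data,
# then continues training on the partner's data, and the server averages
# the resulting models (FedAvg-style). Not the authors' SEM code.
import numpy as np

rng = np.random.default_rng(0)

def sgd_epochs(w, X, y, lr=0.1, epochs=5):
    """Plain logistic-regression SGD; stands in for any local optimizer."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))      # sigmoid predictions
        w -= lr * X.T @ (p - y) / len(y)      # full-batch gradient step
    return w

def paired_local_round(global_w, client_data):
    """One round: pair clients at random, train each model on two shards."""
    ids = rng.permutation(len(client_data))
    pairs = [(ids[i], ids[(i + 1) % len(ids)]) for i in range(len(ids))]
    local_ws = []
    for a, b in pairs:
        w = global_w.copy()                   # start from the global model
        w = sgd_epochs(w, *client_data[a])    # train on own client's data
        w = sgd_epochs(w, *client_data[b])    # continue on partner's data
        local_ws.append(w)
    return np.mean(local_ws, axis=0)          # FedAvg aggregation

# Toy demo: 4 clients with tiny synthetic binary-classification shards.
client_data = []
for _ in range(4):
    X = rng.normal(size=(20, 5))
    y = (X[:, 0] > 0).astype(float)
    client_data.append((X, y))

w = np.zeros(5)
for _ in range(3):
    w = paired_local_round(w, client_data)
print("global weights after 3 rounds:", w)
```

Because the pairing step only changes what data each local optimizer sees, a mechanism of this shape slots into most FL pipelines without altering the aggregation rule, which is consistent with the abstract's claim that the method is orthogonal to other FL algorithms.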