Poster: Optimal Variance-Reduced Client Sampling for Multiple Models Federated Learning

Published: 01 Jan 2024 · Last Modified: 21 May 2025 · ICDCS 2024 · License: CC BY-SA 4.0
Abstract: Federated learning (FL) is a variant of distributed learning in which multiple clients collaborate to learn a global model without sharing their data with the central server. In real-world scenarios, a client may be involved in training multiple unrelated FL models, a setting we call multi-model federated learning (MMFL), where the client sampling strategy and task allocation are crucial to system performance. In this paper, we propose an optimal sampling method that minimizes the variance of the global updates while keeping the learning unbiased in MMFL systems. The resulting method achieves average accuracy over 30% higher than baseline methods, as we demonstrate through simulations on real-world federated datasets.
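The paper's method is not specified in this abstract, but the core idea of variance-minimizing yet unbiased client sampling can be sketched with standard importance sampling: sample clients with probabilities proportional to their update norms (the classical variance-optimal choice) and reweight each sampled update by the inverse of its sampling probability so the aggregated update remains unbiased. The function names and the norm-proportional rule below are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def optimal_sampling_probs(update_norms):
    # Variance-minimizing importance-sampling rule (assumed here):
    # p_i proportional to ||u_i||, the norm of client i's update.
    norms = np.asarray(update_norms, dtype=float)
    return norms / norms.sum()

def unbiased_global_update(updates, probs, num_sampled, rng):
    # Sample num_sampled clients with replacement according to probs,
    # then reweight each sampled update by 1 / (m * p_i) so that the
    # estimator's expectation equals the full sum of all client updates.
    n = len(updates)
    idx = rng.choice(n, size=num_sampled, replace=True, p=probs)
    return sum(updates[i] / (num_sampled * probs[i]) for i in idx)

# Example: three clients, the middle one with the largest update norm.
updates = [np.array([1.0]), np.array([2.0]), np.array([1.0])]
probs = optimal_sampling_probs([np.linalg.norm(u) for u in updates])
rng = np.random.default_rng(0)
estimate = unbiased_global_update(updates, probs, num_sampled=2, rng=rng)
```

Unbiasedness holds because each sampled term has expectation `sum_i p_i * u_i / (m * p_i) = (1/m) * sum_i u_i`, and summing over the `m` draws recovers the full sum; choosing `p_i` proportional to the update norms then minimizes the estimator's variance among all unbiased choices.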