FedBEns: One-Shot Federated Learning based on Bayesian Ensemble

Published: 01 May 2025, Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: This paper proposes a novel approach to One-Shot Federated Learning based on Bayesian inference.
Abstract: One-Shot Federated Learning (FL) is a recent paradigm that enables multiple clients to cooperatively learn a global model in a single round of communication with a central server. In this paper, we analyze the One-Shot FL problem through the lens of Bayesian inference and propose FedBEns, an algorithm that leverages the inherent multimodality of local loss functions to find better global models. Our algorithm fits a mixture of Laplace approximations to each client's local posterior, which the server then aggregates to infer the global model. We conduct extensive experiments on various datasets, demonstrating that the proposed method outperforms competing baselines that typically rely on unimodal approximations of the local losses.
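The abstract's core idea, aggregating clients' Laplace (Gaussian) posterior approximations on the server, can be illustrated with a minimal sketch. Note this is not the FedBEns algorithm itself (which uses mixtures of Laplace approximations; see the linked repository): it shows only the simpler unimodal case, where diagonal Gaussian posteriors from each client are combined by the standard precision-weighted product rule. All function names here are hypothetical.

```python
import numpy as np

def client_laplace_approx(theta_map, hessian_diag):
    """One client's diagonal Laplace approximation N(theta_map, H^-1).

    theta_map: MAP estimate of the local parameters.
    hessian_diag: diagonal of the local loss Hessian at theta_map,
                  used as the Gaussian precision (inverse variance).
    """
    return np.asarray(theta_map, float), np.asarray(hessian_diag, float)

def aggregate_gaussians(client_posteriors):
    """Server-side product of Gaussian posteriors.

    For Gaussians, the (unnormalized) product is again Gaussian with
    precision = sum of precisions, and mean = precision-weighted
    average of the client means.
    """
    means = np.array([m for m, _ in client_posteriors])
    precs = np.array([p for _, p in client_posteriors])
    total_prec = precs.sum(axis=0)
    global_mean = (precs * means).sum(axis=0) / total_prec
    return global_mean, total_prec
```

A quick usage example: a client with a sharper posterior (higher precision) pulls the global mean toward its own MAP estimate, which is exactly the behavior one expects from Bayesian aggregation.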
Lay Summary: In traditional machine learning, data is sent to a central server to train a model. A different approach is Federated Learning, where multiple clients (such as different institutions or devices) collaboratively train a model without sharing their data, preserving user privacy. This study introduces FedBEns, an improved method for these clients to combine their knowledge and train a global model in a single communication round with the central server. FedBEns uses a technique based on Bayesian inference to better capture the unique learning patterns of each client. Tests on various datasets show that FedBEns produces more accurate models than existing methods.
Link To Code: https://github.com/jacopot96/FedBEns
Primary Area: Probabilistic Methods->Bayesian Models and Methods
Keywords: One-shot Federated Learning, Bayesian Inference, Laplace Approximation
Submission Number: 6630