Distributionally Robust Federated Learning with Wasserstein Barycenter

Published: 19 Mar 2024, Last Modified: 19 Mar 2024 · Tiny Papers @ ICLR 2024 Archive · CC BY 4.0
Keywords: Wasserstein barycenter, Distributionally Robust Optimization, Federated Learning
Abstract: Federated Learning (FL) has emerged as a privacy-preserving approach for collaboratively training models without sharing raw data; a key challenge is that the data across clients may not be identically distributed. The nominal distribution that the model actually learns is commonly assumed to be the Euclidean barycenter of the client distributions. In this paper, we propose $\textbf{Fed}$erated $\textbf{D}$istributionally $\textbf{R}$obust $\textbf{O}$ptimization ($\texttt{FedDRO}$), which constructs the Wasserstein barycenter of the client distributions and uses a Wasserstein ball around it as the ambiguity set. We reformulate this paradigm as a min-max optimization problem that trains a robust FL model adversarially, and we analyze its generalization and optimization properties.
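To make the barycenter construction concrete, below is a minimal sketch of the key object the abstract describes: a Wasserstein barycenter of client distributions. It is restricted to the one-dimensional case, where the Wasserstein-2 barycenter of equally-sized empirical distributions has a closed form (sort each client's sample and average the order statistics, i.e., average the quantile functions). The function name `wasserstein_barycenter_1d` and the toy client data are illustrative assumptions, not part of the paper's method.

```python
import numpy as np

def wasserstein_barycenter_1d(samples, weights=None):
    """Wasserstein-2 barycenter of equally-sized 1D empirical distributions.

    In 1D, the W2 barycenter is obtained by averaging the clients'
    quantile functions, which for equal-size samples reduces to
    averaging sorted order statistics. (Illustrative sketch only.)
    """
    sorted_samples = [np.sort(np.asarray(s, dtype=float)) for s in samples]
    n = len(sorted_samples)
    if weights is None:
        weights = np.full(n, 1.0 / n)  # uniform weight per client
    return sum(w * s for w, s in zip(weights, sorted_samples))

# Two hypothetical "clients" with shifted empirical distributions
client_a = [0.0, 1.0, 2.0, 3.0]
client_b = [10.0, 11.0, 12.0, 13.0]
bary = wasserstein_barycenter_1d([client_a, client_b])
print(bary)  # lies midway between the two clients: [5. 6. 7. 8.]
```

Note that the Euclidean barycenter of these two samples (pointwise mean of densities) would be a bimodal mixture, whereas the Wasserstein barycenter is a single shifted distribution; this is the distinction the abstract draws.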
Supplementary Material: pdf
Submission Number: 20