Federated Learning under Label Shifts with Guarantees

23 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: general machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: density ratio estimation; label shifts; discrepancy measures; generalization error
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: We consider the problem of training a global model in a distributed setting and develop an unbiased estimate of the overall *true risk* minimizer across multiple clients under challenging inter-client and intra-client *label shifts*, as a stepping stone toward provably addressing distribution shifts in the real world. Inspired by a broad family of Integral Probability Metrics, we generalize the family of Maximum Likelihood Label Shift (MLLS) density estimation methods, introduce the Variational Regularized Label Shift (VRLS) family of density ratio estimation methods, and show that all MLLS methods are special cases of VRLS under specific latent spaces. Our theory establishes high-probability estimation error bounds achieved through a versatile regularization term in VRLS. Extensive numerical experiments demonstrate that VRLS sets *a new SotA in density ratio estimation*, surpassing all baselines on the MNIST, Fashion-MNIST, and CIFAR-10 datasets as well as under *relaxed label shifts*, a proxy for real-world settings. In distributed settings, our importance-weighted empirical risk minimization with VRLS outperforms federated averaging and other baselines under drastic and challenging label shifts in imbalanced settings.
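To make the abstract's central mechanism concrete: importance-weighted empirical risk minimization under label shift reweights each sample's loss by the ratio of target to source label marginals, w(y) = q(y)/p(y). The minimal sketch below assumes these ratios are given; in the paper they would instead be estimated by VRLS from unlabeled target data, and the function name and toy numbers here are purely illustrative.

```python
import numpy as np

def importance_weighted_risk(losses, labels, p_source, q_target):
    """Average per-sample losses reweighted by label ratios w(y) = q(y)/p(y).

    losses:   per-sample loss values, shape (n,)
    labels:   integer class labels, shape (n,)
    p_source: source label marginal p(y), shape (num_classes,)
    q_target: target label marginal q(y), shape (num_classes,)
    """
    w = q_target / p_source          # per-class importance weights
    return float(np.mean(w[labels] * losses))

# Toy example: 3 classes, uniform source distribution, skewed target.
p = np.array([1 / 3, 1 / 3, 1 / 3])
q = np.array([0.6, 0.3, 0.1])
labels = np.array([0, 0, 1, 2])
losses = np.array([1.0, 0.5, 2.0, 1.0])
risk = importance_weighted_risk(losses, labels, p, q)  # → 1.2
```

Under the classical label-shift assumption (p(x|y) fixed, only p(y) changing), this weighted empirical risk is an unbiased estimate of the target-distribution risk, which is why accurate density ratio estimation is the crux of the approach.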
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 7235