Abstract: In Federated Learning (FL), multiple clients collaborate to learn a shared model through
a central server while keeping data decentralized. Personalized Federated Learning (PFL)
further extends FL by learning a personalized model per client. In both FL and PFL,
all clients participate in the training process and their labeled data are used for training.
However, in reality, novel clients may wish to join a prediction service after it has been
deployed, obtaining predictions for their own unlabeled data.
Here, we introduce a new learning setup, On-Demand Unlabeled PFL (OD-PFL), in which a
system trained on a set of clients must later be applied to novel unlabeled clients at
inference time. We propose a novel approach to this problem, ODPFL-HN, which learns
to produce a new model for the late-to-the-party client. Specifically, we train an encoder
network that learns a representation for a client given its unlabeled data. That client
representation is fed to a hypernetwork that generates a personalized model for that client.
Evaluated on five benchmark datasets, ODPFL-HN generalizes better than current
FL and PFL methods, especially when the novel client exhibits a large distribution shift from
the training clients. We also analyze the generalization error for novel clients, and show
analytically and experimentally how novel clients can apply differential privacy to protect their data.
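To make the encoder-plus-hypernetwork pipeline concrete, below is a minimal PyTorch sketch of the idea the abstract describes: an encoder pools a client's unlabeled examples into a single representation, and a hypernetwork maps that representation to the weights of a personalized model. All names and sizes here (ClientEncoder, HyperNet, IN_DIM, the mean-pooling choice, the one-layer target classifier) are illustrative assumptions, not the paper's actual architecture.

```python
# Minimal sketch of an ODPFL-HN-style pipeline (illustrative, not the
# authors' implementation). Layer sizes and module names are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

IN_DIM, EMB_DIM, HIDDEN, N_CLASSES = 32, 64, 128, 10

class ClientEncoder(nn.Module):
    """Embed a client's unlabeled dataset into one descriptor:
    per-example encoding followed by mean pooling (permutation
    invariant, so example order does not matter)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(IN_DIM, HIDDEN), nn.ReLU(),
                                 nn.Linear(HIDDEN, EMB_DIM))

    def forward(self, x):                 # x: (n_examples, IN_DIM)
        return self.net(x).mean(dim=0)    # (EMB_DIM,)

class HyperNet(nn.Module):
    """Map a client descriptor to the weights of a one-layer
    personalized classifier."""
    def __init__(self):
        super().__init__()
        self.w_head = nn.Linear(EMB_DIM, N_CLASSES * IN_DIM)
        self.b_head = nn.Linear(EMB_DIM, N_CLASSES)

    def forward(self, z):
        W = self.w_head(z).view(N_CLASSES, IN_DIM)
        b = self.b_head(z)
        return W, b

encoder, hyper = ClientEncoder(), HyperNet()

# A novel client arrives at inference time with only unlabeled data:
x_unlabeled = torch.randn(200, IN_DIM)
z = encoder(x_unlabeled)               # client representation
W, b = hyper(z)                        # personalized model, no retraining
logits = F.linear(x_unlabeled, W, b)   # predictions for the novel client
```

Because the personalized model is produced in a single forward pass, the novel client never needs labels or additional training rounds, which is the point of the on-demand setup.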
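Similarly, here is a hedged sketch of how a novel client might privatize its representation before sending it to the server, using norm clipping plus Gaussian noise. The clipping bound, noise scale, and the decision to noise the pooled embedding (rather than per-example statistics) are assumptions for illustration, not the specific mechanism analyzed in the paper.

```python
# Illustrative sketch: privatize a client embedding with the Gaussian
# mechanism before it leaves the client. Parameters are assumptions.
import torch

def privatize(z: torch.Tensor, clip_norm: float = 1.0,
              sigma: float = 0.5) -> torch.Tensor:
    """Clip the embedding's L2 norm to bound sensitivity, then add
    Gaussian noise scaled to the clipping bound."""
    scale = torch.clamp(clip_norm / (z.norm() + 1e-12), max=1.0)
    z = z * scale
    return z + torch.randn_like(z) * sigma * clip_norm

z = torch.randn(64)        # stand-in for the encoder's client representation
z_private = privatize(z)   # sent to the server instead of the raw embedding
```

Clipping fixes an upper bound on how much any one client's data can move the embedding, so the added noise yields a differential-privacy guarantee whose strength is tuned through sigma.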
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Gang_Niu1
Submission Number: 1054