Federated Generative Learning with Foundation Models

15 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: general machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: federated learning, non-iid data
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We propose a novel federated learning framework, namely Federated Generative Learning, that transmits prompts associated with the distributed training data between clients and the server.
Abstract: Existing federated learning solutions focus on transmitting features, parameters, or gradients between clients and the server, which suffers from severe inefficiency and privacy-leakage problems. Leveraging emerging foundation generative models, we propose a novel federated learning framework, namely Federated Generative Learning. In this framework, each client creates text prompts tailored to its local data and sends them to the server. Given the received prompts, informative training data can be synthesized remotely on the server using foundation generative models. This framework offers several advantages: enhanced communication efficiency, improved resilience to distribution shift, significant performance gains, and stronger privacy protection. We validate these benefits through extensive experiments on the ImageNet and DomainNet datasets; e.g., on ImageNet100 with a highly skewed data distribution, our method outperforms FedAvg by 12% in a single communication round. Moreover, our approach requires only 229 bytes of prompts for communication, whereas FedAvg must transmit 42.7 MB of parameters.
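To make the communication pattern concrete, below is a minimal sketch of one Federated Generative Learning round. It assumes a Stable Diffusion model on the server (loaded via Hugging Face `diffusers`) and simple class-name-based prompt templates on the clients; the function names and prompt format are illustrative assumptions, not the authors' exact implementation.

```python
# Sketch of one Federated Generative Learning round (illustrative only).
# Assumptions: server hosts Stable Diffusion v1.5 via `diffusers`; clients
# derive prompts from the class labels in their local partitions.

import torch
from diffusers import StableDiffusionPipeline

def client_make_prompts(local_class_names):
    # Client side: derive compact text prompts from local data.
    # Assumed prompt template; the paper's templates may differ.
    return [f"a photo of a {name}" for name in local_class_names]

def server_synthesize(prompts, images_per_prompt=4):
    # Server side: synthesize training images from the received prompts
    # with a foundation generative model.
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")
    synthetic = []
    for prompt in prompts:
        out = pipe(prompt, num_images_per_prompt=images_per_prompt)
        # Keep (image, prompt) pairs as labeled synthetic training data.
        synthetic.extend((img, prompt) for img in out.images)
    return synthetic

# Only the few-hundred-byte prompt list crosses the network, instead of
# model parameters or gradients as in FedAvg.
prompts = client_make_prompts(["golden retriever", "tabby cat"])
synthetic_data = server_synthesize(prompts)
```

The key design point this illustrates is that the client-to-server payload is plain text whose size is independent of the model, so communication cost no longer scales with parameter count.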
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 1