Co-Dream: Collaborative data synthesis with decentralized models

Published: 16 Jun 2023, Last Modified: 17 Jul 2023, ICML LLW 2023
Keywords: Deep learning, Federated learning, Collaborative inference, Data synthesis
Abstract: We present a framework for distributed optimization that addresses the decentralized and siloed nature of real-world data. Existing work in federated learning addresses this by learning a centralized model from decentralized data. Our framework \textit{Co-Dream} instead focuses on learning a representation of the data itself. Starting from random data, clients jointly synthesize samples that serve as proxies for the global data distribution. Importantly, this collaborative synthesis uses only local models, ensuring privacy comparable to sharing the model itself. Collaboration among clients is facilitated through federated optimization in the data space, leveraging shared gradients of the local loss with respect to the inputs. Collaborative data synthesis offers several benefits over collaborative model learning, including lower dimensionality, parameter-independent communication, and adaptive optimization. We empirically validate the effectiveness of our framework and compare it with traditional federated learning approaches through benchmarking experiments.
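The core mechanism described in the abstract — federated optimization in the data space via shared input gradients — can be illustrated with a toy sketch. This is a hypothetical setup, not the paper's implementation: each client holds a fixed local model (here a simple linear scorer with a scalar target, both invented for illustration) and exchanges only the gradient of its local loss with respect to the shared synthetic data, which is then averaged to update the data itself.

```python
import numpy as np

rng = np.random.default_rng(0)
num_clients, dim = 3, 5

# Hypothetical local models: each client k has fixed weights w_k and a
# local scalar target t_k. Only input gradients are ever shared.
client_weights = [rng.normal(size=dim) for _ in range(num_clients)]
client_targets = [rng.normal() for _ in range(num_clients)]

def local_input_grad(w, t, x):
    # Local loss: 0.5 * (w @ x - t)^2; gradient taken w.r.t. the INPUT x,
    # not the model parameters, so the model never leaves the client.
    return (w @ x - t) * w

def avg_loss(x):
    return sum(0.5 * (w @ x - t) ** 2
               for w, t in zip(client_weights, client_targets))

x = rng.normal(size=dim)  # start from random data, as in the framework
initial_loss = avg_loss(x)

lr = 0.05
for _ in range(2000):
    grads = [local_input_grad(w, t, x)
             for w, t in zip(client_weights, client_targets)]
    # Federated averaging happens over input gradients, so communication
    # cost scales with the data dimension, not the model size.
    x -= lr * np.mean(grads, axis=0)

final_loss = avg_loss(x)
# x now approximately minimizes the average of the clients' local losses,
# acting as a synthesized proxy consistent with all clients.
```

Note that the message size here is `dim` floats per round regardless of how large each client's model is, which mirrors the parameter-independent communication benefit the abstract claims.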
Submission Number: 19