SPIRE: Conditional Personalization for Federated Diffusion Generative Models
TL;DR: We propose Shared-Backbone Personal Identity Representation Embeddings (SPIRE), a framework that casts per-client, diffusion-based generation as conditional generation in FL.
Abstract: Two defining characteristics of federated learning (FL) client data are distributional heterogeneity and small local sample sizes. These properties necessitate data-efficient, client-specific adaptation rather than a one-size-fits-all model. Recent advances in diffusion models have revolutionized generative AI, but their scale makes straightforward fine-tuning impractical and personalization difficult. To enable personalized diffusion generative models, we propose Shared-Backbone Personal Identity Representation Embeddings (SPIRE), a framework that casts per-client, diffusion-based generation as conditional generation in FL. SPIRE factorizes the network into (i) a high-capacity global backbone that learns a population-level score function and (ii) lightweight, learnable client embeddings that encode local data statistics. This separation enables parameter-efficient fine-tuning that touches $<0.01\%$ of the weights. We provide the first theoretical bridge between conditional diffusion training and maximum-likelihood estimation in Gaussian-mixture models. For a two-component mixture, we prove that gradient descent on the DDPM loss with respect to the mixing weights recovers the optimal mixing weights and enjoys dimension-free error bounds. Our analysis also hints at how client embeddings act as biases that steer a shared score network toward personalized distributions. Empirically, SPIRE matches or surpasses strong baselines during collaborative pre-training and vastly outperforms them when adapting to new, unseen clients, reducing Kernel Inception Distance while updating only hundreds of parameters. SPIRE further mitigates catastrophic forgetting and remains robust across fine-tuning learning-rate and epoch choices.
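As a concrete illustration of the factorization the abstract describes, below is a minimal sketch of a SPIRE-style conditional score network and its adaptation loop, assuming PyTorch, an MLP backbone, a variance-preserving noise schedule, and an `nn.Embedding` table of client vectors; every name, dimension, and schedule here is an illustrative assumption, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class SPIREScoreNet(nn.Module):
    """Sketch of a shared-backbone score network with client embeddings.

    The backbone models the population-level score; the embedding table
    holds one small vector per client that conditions (steers) it.
    """

    def __init__(self, data_dim, num_clients, embed_dim=64, hidden=256):
        super().__init__()
        # Per-client identity embeddings: the only weights updated when
        # personalizing to a client (hundreds of parameters per client).
        self.client_embed = nn.Embedding(num_clients, embed_dim)
        # High-capacity global backbone, shared across all clients.
        self.backbone = nn.Sequential(
            nn.Linear(data_dim + embed_dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, data_dim),
        )

    def forward(self, x_t, t, client_id):
        # The client embedding enters as an extra input, acting like a
        # bias that shifts the shared score toward the client's data.
        e = self.client_embed(client_id)
        return self.backbone(torch.cat([x_t, e, t[:, None]], dim=-1))

def personalize(model, local_data, client_id, steps=200, lr=1e-2, batch=32):
    """Adapt to one (possibly new) client: freeze the backbone, train
    only that client's embedding row on the standard DDPM loss."""
    for p in model.backbone.parameters():
        p.requires_grad_(False)           # the shared backbone stays fixed
    opt = torch.optim.Adam(model.client_embed.parameters(), lr=lr)
    for _ in range(steps):
        x0 = local_data[torch.randint(len(local_data), (batch,))]
        t = torch.rand(batch)             # diffusion time in (0, 1)
        eps = torch.randn_like(x0)
        # Variance-preserving forward process (cosine schedule, illustrative).
        a = torch.cos(0.5 * torch.pi * t)[:, None]
        s = torch.sin(0.5 * torch.pi * t)[:, None]
        x_t = a * x0 + s * eps
        ids = torch.full((batch,), client_id, dtype=torch.long)
        loss = ((model(x_t, t, ids) - eps) ** 2).mean()  # noise prediction
        opt.zero_grad(); loss.backward(); opt.step()
```

In this sketch a call such as `personalize(model, new_client_data, client_id=41)` moves only row 41 of the embedding table, which is consistent with the abstract's two empirical claims: the update touches far below $0.01\%$ of the weights, and the shared score function is never overwritten, mitigating catastrophic forgetting.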
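The theoretical claim can be stated in a math sketch as follows; the notation ($\bar\alpha_t$, the noise predictor $\epsilon_w$) is standard DDPM convention assumed here, not quoted from the paper.

```latex
% Two-component Gaussian mixture in R^d with unknown mixing weight w:
\[
  p_w(x) = w\,\mathcal{N}(x; \mu, I_d) + (1 - w)\,\mathcal{N}(x; -\mu, I_d).
\]
% DDPM (noise-prediction) loss viewed as a function of w alone, with
% the denoiser \epsilon_w induced by the mixture's exact score:
\[
  \mathcal{L}(w) = \mathbb{E}_{t,\, x_0 \sim p_{w^\star},\, \epsilon \sim \mathcal{N}(0, I_d)}
  \bigl\| \epsilon_w\bigl(\sqrt{\bar\alpha_t}\, x_0 + \sqrt{1 - \bar\alpha_t}\,\epsilon,\ t\bigr) - \epsilon \bigr\|^2.
\]
% Abstract's claim: gradient descent on L(w) recovers the optimal
% weight w*, with error bounds that do not depend on the dimension d.
```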
Submission Number: 700