Conditional Moment Alignment for Improved Generalization in Federated Learning

Published: 21 Oct 2022, Last Modified: 05 May 2023, FL-NeurIPS 2022 Oral, Readers: Everyone
Keywords: Federated Learning, Model Heterogeneity, Hardware Heterogeneity, Domain Adaptation, Distributed Learning, Privacy, Generalization
TL;DR: We propose a federated conditional moment alignment algorithm for the model-heterogeneous federated learning setup and derive the convergence and generalization properties of the algorithm.
Abstract: In this work, we study model-heterogeneous Federated Learning (FL) for classification, where different clients have different model architectures. Unlike existing works on model heterogeneity, we neither require access to a public dataset nor impose constraints on the clients' model architectures, and we keep the clients' models and data private. We prove a generalization result that provides fundamental insights into the role of the representations in FL, and we propose a theoretically grounded algorithm, Federated Conditional Moment Alignment (FedCMA), that aligns the class-conditional distributions of each client in the feature space. We prove convergence of the algorithm and show empirically that FedCMA outperforms other baselines on CIFAR-10, MNIST, EMNIST, and FEMNIST in the considered setting.
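Since only the abstract is shown here, the following is a minimal, hedged sketch of what aligning class-conditional distributions in feature space could look like at the level of first moments: each client penalizes the distance between its local per-class feature means and shared per-class anchors. The function name `conditional_moment_loss` and the arguments `features`, `labels`, and `global_class_means` are illustrative assumptions, not the authors' actual FedCMA implementation.

```python
import torch
import torch.nn.functional as F


def conditional_moment_loss(features, labels, global_class_means):
    """Penalize the gap between each class's local feature mean (first
    conditional moment) and a shared per-class anchor.

    features:           (batch, d) local feature representations
    labels:             (batch,)   integer class labels
    global_class_means: (num_classes, d) shared per-class anchors
    """
    loss = features.new_zeros(())
    num_present = 0
    for c in labels.unique():
        mask = labels == c
        local_mean = features[mask].mean(dim=0)  # local conditional mean for class c
        loss = loss + F.mse_loss(local_mean, global_class_means[c])
        num_present += 1
    return loss / max(num_present, 1)
```

In such a sketch, each client would add this term to its usual classification loss, and the server would aggregate per-class feature statistics rather than model weights, which is what would allow clients to keep heterogeneous architectures while still aligning their representations.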
Is Student: Yes
