Abstract: We address the problem of client selection for source-free domain adaptation in heterogeneous federated learning (FL). In this setting, a central server possesses an unlabeled target-domain dataset and aims to learn a model by leveraging locally trained models from a large pool of K non-IID clients. Crucially, only a small subset of clients has data distributions that meaningfully align with the server's target domain, making effective client selection essential. However, due to strict privacy constraints, the server cannot access raw client data, client-side statistics, or labels for its own dataset; it can only evaluate the locally trained client models on unlabeled target samples. To tackle this challenge, we propose Federated Sparse Consensus Matching (FedSCM), a principled, optimization-based method for label-free and data-free client selection. FedSCM selects clients whose predictions are both confident and mutually consistent by solving an entropy-regularized sparse optimization problem over client weights. We prove that FedSCM always yields a sparse solution and, under a novel Dirichlet-based expertise model, identifies the correct subset of relevant clients with high probability, provided n = Ω(log K) target samples. We further establish local and global convergence guarantees under mild conditions. Extensive experiments on CIFAR-10, CIFAR-100, and SVHN demonstrate that FedSCM consistently outperforms existing approaches to federated domain adaptation while significantly reducing both communication and computation overhead. Our framework offers a general and theoretically grounded approach to selective model aggregation under extreme data heterogeneity and limited supervision.
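To make the selection idea concrete, the following is a minimal, hypothetical sketch of an entropy-regularized weight optimization in the spirit the abstract describes. The specific objective (mean entropy of the weighted ensemble prediction as a confidence/consensus term, plus a weight-entropy regularizer that drives the solution toward a sparse vertex of the simplex), the function names, and all hyperparameters are our illustrative assumptions, not the paper's actual FedSCM formulation:

```python
import numpy as np

def simplex_projection(v):
    """Euclidean projection of a vector onto the probability simplex."""
    u = np.sort(v)[::-1]                              # sort descending
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)
    return np.maximum(v - theta, 0.0)

def select_clients(probs, lam=0.1, steps=500, lr=0.05, eps=1e-12):
    """Hypothetical sketch of entropy-regularized client selection.

    probs: (K, n, C) softmax outputs of K client models on n unlabeled
           target samples. Returns a weight vector w on the simplex;
           irrelevant clients are driven toward zero weight.
    """
    K, n, _ = probs.shape
    w = np.full(K, 1.0 / K)
    for _ in range(steps):
        Q = np.einsum('k,knc->nc', w, probs)          # weighted ensemble prediction
        # Gradient of the mean ensemble entropy  -1/n * sum_{n,c} Q log Q  w.r.t. w_k
        g_conf = -np.einsum('knc,nc->k', probs, np.log(Q + eps) + 1.0) / n
        # Gradient of lam * H(w); minimizing H(w) concentrates mass (sparsity)
        g_w = lam * (-(np.log(w + eps) + 1.0))
        w = simplex_projection(w - lr * (g_conf + g_w))
    return w
```

In this sketch, projected gradient descent keeps the weights on the simplex at every step, the confidence term favors clients whose predictions are confident and mutually consistent (a confident shared prediction lowers the ensemble's entropy), and the weight-entropy term pushes small weights to exactly zero, yielding a sparse selection.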
Submission Type: Long submission (more than 12 pages of main content)
Changes Since Last Submission: Dear Action Editor,
Thank you for considering our work. We have addressed all previously raised issues in the manuscript and submitted a revised version accordingly. Please let us know if any further changes are required.
Thank you in advance.
Best regards,
The Authors
Assigned Action Editor: ~Konstantin_Mishchenko1
Submission Number: 8618