Keywords: Federated Learning, Variance Reduction, Social Systems
TL;DR: FedCVR-Bolt, a federated learning algorithm that reduces update variance by clustering clients based on asymptotic agreement and selecting one representative per cluster.
Abstract: Federated Learning (FL) enables privacy-preserving collaborative model training, but its effectiveness is often limited by client data heterogeneity.
We introduce a client-selection algorithm that (i) dynamically forms non-overlapping coalitions of clients based on asymptotic agreement and (ii) selects one representative from each coalition to minimize the variance of model updates. Our approach is inspired by social-network modeling, leveraging homophily-based proximity matrices for spectral clustering and techniques for identifying the most informative individuals to estimate a group’s aggregate opinion. We provide theoretical convergence guarantees for the algorithm under mild, standard FL assumptions.
Finally, we validate our approach by benchmarking it against three strong heterogeneity-aware baselines; the results show higher accuracy and faster convergence, indicating that the framework is both theoretically grounded and effective in practice.
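The two-step procedure the abstract describes can be illustrated with a minimal sketch. Note this is an assumption-laden illustration, not the paper's actual method: the homophily-based proximity matrix is approximated here by a Gaussian kernel over pairwise update distances (the paper's asymptotic-agreement measure is not specified in the abstract), and the "representative that minimizes update variance" is approximated by the cluster medoid.

```python
# Hypothetical sketch of coalition formation + representative selection,
# NOT the FedCVR-Bolt algorithm itself.
import numpy as np
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(0)

# Toy setup: 12 clients with 5-dim model updates drawn from 3 latent groups.
updates = np.vstack([rng.normal(loc=c, scale=0.1, size=(4, 5)) for c in range(3)])

# Step (i): proximity matrix from pairwise similarity, then spectral clustering
# into non-overlapping coalitions. The Gaussian kernel stands in for the
# paper's homophily/asymptotic-agreement measure (an assumption).
dists = np.linalg.norm(updates[:, None] - updates[None, :], axis=-1)
proximity = np.exp(-dists**2)
labels = SpectralClustering(n_clusters=3, affinity="precomputed",
                            random_state=0).fit_predict(proximity)

# Step (ii): pick one representative per coalition. Here: the medoid, i.e.
# the client whose update is closest to the coalition mean, as a simple
# proxy for variance minimization (an assumption).
reps = []
for k in range(3):
    idx = np.flatnonzero(labels == k)
    mean = updates[idx].mean(axis=0)
    reps.append(int(idx[np.argmin(np.linalg.norm(updates[idx] - mean, axis=1))]))
print("coalitions:", labels.tolist(), "representatives:", reps)
```

The server would then aggregate updates from the selected representatives only, rather than from all clients, reducing the variance contributed by redundant, highly correlated updates within each coalition.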
Primary Area: learning theory
Submission Number: 20033