Keywords: Bias, Heterogeneity, Client Selection
TL;DR: We present a client selection scheduler for federated learning that optimizes training time while ensuring slow clients also get a chance to contribute to the global model, thus mitigating socioeconomic bias.
Abstract: Federated learning enables learning over heterogeneous user data in a distributed manner while preserving user privacy. However, its standard client selection technique is a source of bias because it discriminates against slow clients. First, it selects only clients that satisfy certain network and system-specific criteria, excluding slow clients from the outset. Even when such clients are included in the training process, they either struggle to complete training or are dropped altogether for being too slow. Our proposed approach seeks a sweet spot between fast convergence and heterogeneity through smarter client selection and scheduling techniques.
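One way to realize the selection policy the abstract describes is to reserve a quota of each round's participants for slow clients instead of filtering purely on speed. The sketch below is purely illustrative and not the paper's method; the `select_clients` function, the `slow` flag, and the `slow_fraction` quota are all hypothetical names chosen for this example.

```python
import random

def select_clients(clients, k, slow_fraction=0.2, rng=None):
    """Hypothetical fairness-aware selection: pick k clients per round,
    mostly fast ones to keep training time low, but reserve a quota
    (slow_fraction of k) for slow clients so they still contribute."""
    rng = rng or random.Random(0)
    fast = [c for c in clients if not c["slow"]]
    slow = [c for c in clients if c["slow"]]
    # Reserve at least one slot for slow clients when any exist.
    n_slow = min(len(slow), max(1, int(k * slow_fraction)))
    n_fast = min(len(fast), k - n_slow)
    return rng.sample(slow, n_slow) + rng.sample(fast, n_fast)

# Example: 20 clients, 4 of them slow; select 10 per round.
clients = [{"id": i, "slow": i % 5 == 0} for i in range(20)]
selected = select_clients(clients, k=10)
```

A fixed quota is only one option; the trade-off could equally be handled by deadline-aware scheduling or by weighting selection probabilities by past participation.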