Abstract: Federated Learning (FL) is a distributed machine learning (ML) approach that allows multiple devices to train a shared model while keeping training data private on edge devices. Improving model generalizability requires more training information, which can be achieved by involving a greater number of edge devices in the training process. In existing frameworks, however, training parameters such as the number of communication rounds (CR) and the aggregation algorithm are kept constant regardless of the number of clients, leading to excessive and unavoidable carbon emissions (CE). Most frameworks also lack a module to estimate CE during training, and balancing high performance, fewer CR, and greater energy efficiency remains challenging. In this paper, we address these limitations by introducing Fed2Tier, an open-source FL system that categorizes clients according to their individual characteristics and then groups them accordingly. The introduction of intermediate nodes in our system effectively handles heterogeneous clients, e.g., those with varying underlying data distributions, while concurrently reducing both CR and CE and training with improved privacy. Source code, documentation, and tutorials are available at: https://github.com/apoorvakliv/fed2tier.
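To illustrate the two-tier idea described above, the following is a minimal sketch (all function and variable names are hypothetical, not taken from the actual fed2tier codebase): clients are grouped under intermediate nodes, each node aggregates its group's updates, and the server aggregates only the node-level models, reducing server-side communication rounds.

```python
def fedavg(models, weights):
    """Weighted average of model parameter lists (plain FedAvg)."""
    total = sum(weights)
    n_params = len(models[0])
    return [sum(w * m[i] for m, w in zip(models, weights)) / total
            for i in range(n_params)]

def two_tier_round(groups):
    """groups: one list per intermediate node of (client_params, n_samples)."""
    node_models, node_sizes = [], []
    for group in groups:                      # tier 1: node-level aggregation
        params = [p for p, _ in group]
        sizes = [n for _, n in group]
        node_models.append(fedavg(params, sizes))
        node_sizes.append(sum(sizes))
    return fedavg(node_models, node_sizes)    # tier 2: server aggregation

# Toy example: two nodes, each grouping two clients with 2-parameter models.
groups = [
    [([1.0, 2.0], 10), ([3.0, 4.0], 10)],
    [([5.0, 6.0], 20), ([7.0, 8.0], 20)],
]
global_model = two_tier_round(groups)
```

With sample-count weighting, this hierarchical average matches a flat weighted FedAvg over all clients, while the server exchanges models only with the intermediate nodes rather than every client.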