FedAVO: Improving Communication Efficiency in Federated Learning with African Vultures Optimizer

Published: 01 Jan 2024 · Last Modified: 12 Nov 2025 · COMPSAC 2024 · License: CC BY-SA 4.0
Abstract: Federated Learning (FL) has recently gained tremendous popularity due to its emphasis on user data privacy. However, the distributed computations of FL can result in constrained communication and drawn-out learning processes, necessitating optimization of client-server communication costs. The fraction of selected clients and the number of local training passes are two hyperparameters that have a significant impact on FL performance. Because training preferences differ across applications, it can be difficult for FL practitioners to select such hyperparameters manually. In this paper, we introduce FedAVO, a novel FL algorithm that improves communication efficiency by selecting the best hyperparameters using the African Vultures Optimizer (AVO). Our research demonstrates that the communication costs associated with FL operations can be substantially reduced by adopting AVO for FL hyperparameter tuning. Through extensive evaluations of FedAVO on benchmark datasets, we identify optimal hyperparameters well suited to those datasets, ultimately increasing global model accuracy by 6% over state-of-the-art FL algorithms (such as FedAvg, FedProx, and FedPSO). The code, data, and experiments are publicly available in our GitHub repository: https://github.com/speedlab-git/FedAVO.
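To make the idea concrete, below is a minimal, hypothetical Python sketch of how a vulture-style population search could tune the two hyperparameters the abstract names: the client fraction and the number of local epochs. This is not the paper's implementation; the `fitness` function is a toy stand-in for the accuracy/communication trade-off that FedAVO would measure by running real FL rounds, and the update rule is a heavy simplification of AVO's exploration and exploitation phases.

```python
import random

# Toy stand-in objective: reward a (made-up) accuracy proxy, penalize
# communication cost. In FedAVO this signal would come from actual FL rounds.
def fitness(client_fraction, local_epochs):
    accuracy_proxy = 1.0 - abs(client_fraction - 0.3) - abs(local_epochs - 5) / 20.0
    comm_cost = client_fraction + 0.01 * local_epochs
    return accuracy_proxy - 0.1 * comm_cost

BOUNDS = {"client_fraction": (0.05, 1.0), "local_epochs": (1, 20)}

def clip(value, lo, hi):
    return max(lo, min(hi, value))

def avo_search(pop_size=10, iterations=30):
    # Initialize a population of "vultures", each a candidate hyperparameter pair.
    pop = [(random.uniform(*BOUNDS["client_fraction"]),
            random.randint(*BOUNDS["local_epochs"])) for _ in range(pop_size)]
    best = max(pop, key=lambda p: fitness(*p))
    for t in range(iterations):
        # Linearly decaying factor: early iterations explore, later ones exploit,
        # a simplified analogue of AVO's satiation-rate schedule.
        decay = 1.0 - t / iterations
        new_pop = []
        for cf, ep in pop:
            if random.random() < decay:
                # Exploration: random perturbation around the current position.
                cf2 = cf + random.uniform(-0.2, 0.2) * decay
                ep2 = ep + random.choice([-2, -1, 1, 2])
            else:
                # Exploitation: move toward the best vulture found so far.
                cf2 = cf + (best[0] - cf) * random.random()
                ep2 = ep + round((best[1] - ep) * random.random())
            cf2 = clip(cf2, *BOUNDS["client_fraction"])
            ep2 = int(clip(ep2, *BOUNDS["local_epochs"]))
            new_pop.append((cf2, ep2))
        pop = new_pop
        best = max(pop + [best], key=lambda p: fitness(*p))  # keep the elite
    return best

if __name__ == "__main__":
    cf, ep = avo_search()
    print(f"best client fraction = {cf:.2f}, best local epochs = {ep}")
```

Replacing the toy `fitness` with a function that runs a few federated rounds and returns validation accuracy minus a communication penalty would turn this skeleton into an online hyperparameter tuner in the spirit the abstract describes; for the authors' actual algorithm, see the linked repository.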