Sliding Focal Loss for Class Imbalance Classification in Federated XGBoost

Published: 01 Jan 2022, Last Modified: 13 May 2025 · ISPA/BDCloud/SocialCom/SustainCom 2022 · CC BY-SA 4.0
Abstract: As a very popular framework, federated learning enables heterogeneous participants to cooperatively train global models without exposing their local data. It not only takes advantage of massive raw data, but also fundamentally protects the privacy of participants. An unavoidable challenge is that the class imbalance introduced by many participants seriously degrades model performance and can even harm convergence. Introducing focal loss to dynamically adjust sample weights during training is a good way to relieve this issue. In our experiments on traditional federated XGBoost, we find a trade-off between the convergence of focal loss and the final accuracy of cross-entropy. Motivated by this property, we propose hyperparametric linear and exponential sliding schemes that combine the advantages of both losses. A complete set of experiments shows that gradual sliding, rather than an abrupt switch between the two losses, is necessary. Linear sliding outperforms pure focal loss in three of the four class-imbalance scenarios, and exponential sliding performs the best in all four; in two of them it even exceeds cross-entropy within a finite number of communication rounds.
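The abstract does not spell out the sliding schedules, so the following is only a minimal sketch of the idea: a per-round weight interpolates between focal loss and cross-entropy, either linearly or exponentially over the communication rounds. The direction of the slide (starting from focal loss and moving toward cross-entropy), the rate parameter `k`, and all function names are assumptions for illustration, not the paper's actual formulation.

```python
import math

def cross_entropy(p, y):
    # Binary cross-entropy for predicted probability p and label y in {0, 1}.
    p = min(max(p, 1e-12), 1 - 1e-12)
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def focal_loss(p, y, gamma=2.0):
    # Focal loss (Lin et al., 2017): down-weights easy examples by (1 - p_t)^gamma.
    p = min(max(p, 1e-12), 1 - 1e-12)
    p_t = p if y == 1 else 1 - p
    return -((1 - p_t) ** gamma) * math.log(p_t)

def sliding_weight(t, T, mode="linear", k=5.0):
    # Fraction of weight given to cross-entropy at communication round t of T.
    # "linear" ramps 0 -> 1 uniformly; "exponential" ramps with rate k
    # (both schedules and the rate k are illustrative assumptions).
    if mode == "linear":
        return t / T
    return (math.exp(k * t / T) - 1) / (math.exp(k) - 1)

def sliding_loss(p, y, t, T, mode="linear", gamma=2.0):
    # Convex combination that starts at focal loss (t = 0) and
    # slides toward cross-entropy (t = T) as training progresses.
    w = sliding_weight(t, T, mode)
    return (1 - w) * focal_loss(p, y, gamma) + w * cross_entropy(p, y)
```

In a gradient-boosting setting such as XGBoost, the first- and second-order derivatives of this combined loss with respect to the raw margin would be supplied to the booster each round; the sketch above shows only the loss values themselves.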