FedPGRL: Prototype-Guided Refinement for Federated Learning with Long-Tailed Noisy Clients
Keywords: Robust Federated Learning, Federated Long-Tail Learning, Federated Noisy Label Learning, Computer Vision, Deep Learning.
Abstract: Federated Learning (FL) is a promising framework for privacy-preserving collaborative learning across distributed clients. However, real-world FL applications, especially large-scale manufacturing quality inspection, face the combined effects of noisy labels and long-tailed data, which arise from inconsistent human annotation and the rarity of certain defects. Most existing methods address noisy labels or long-tailed distributions in isolation; few handle both jointly, which leads to biased models and degraded performance. To tackle this, we propose Federated Learning with Prototype-Guided Refinement Learning (FedPGRL), a unified framework built on the CLIP2FL architecture. FedPGRL uses class prototypes to filter and refine training samples through three stages: Prototypical Distance Selection (PDS), Prototypes Contrast Refinement (PCR), and Multi Prototypical Mask Refinement (MPMR). These stages work jointly to generate soft labels and denoise the training data, improving robustness to both label noise and class imbalance. We also introduce Class Prototypical Regularization (CPR) to enhance inter-class separation and intra-class compactness during training. Extensive experiments on CIFAR-10, CIFAR-100, and Tiny-ImageNet, under both simulated and real-world FL settings, show that FedPGRL consistently outperforms state-of-the-art methods in classification accuracy and stability.
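The abstract does not specify the concrete selection rule or regularizer, so the following is only a minimal PyTorch sketch of the general prototype-guided idea it describes: filtering samples by distance to their class prototype and regularizing for intra-class compactness and inter-class separation. The function names, the cosine-distance criterion, the threshold tau, and the particular intra/inter penalty terms are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn.functional as F


def compute_class_prototypes(features: torch.Tensor, labels: torch.Tensor, num_classes: int) -> torch.Tensor:
    """Mean feature vector per class, L2-normalized (zero row if a class is absent)."""
    protos = torch.zeros(num_classes, features.size(1), device=features.device)
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            protos[c] = features[mask].mean(dim=0)
    return F.normalize(protos, dim=1)


def prototypical_distance_select(features, labels, prototypes, tau=0.5):
    """Keep samples whose cosine distance to their own class prototype is below tau."""
    feats = F.normalize(features, dim=1)
    sim_to_own = (feats * prototypes[labels]).sum(dim=1)  # cosine similarity to own prototype
    return (1.0 - sim_to_own) < tau                       # boolean keep-mask over the batch


def class_prototypical_regularizer(features, labels, prototypes):
    """Pull features toward their class prototype; penalize similarity between different prototypes."""
    feats = F.normalize(features, dim=1)
    intra = (1.0 - (feats * prototypes[labels]).sum(dim=1)).mean()      # intra-class compactness
    proto_sim = prototypes @ prototypes.t()
    off_diag = proto_sim - torch.diag(torch.diag(proto_sim))            # zero out the diagonal
    inter = off_diag.clamp(min=0).mean()                                # inter-class separation
    return intra + inter


if __name__ == "__main__":
    # Toy usage with random features (hypothetical shapes: 64 samples, 128-d, 10 classes).
    feats = torch.randn(64, 128)
    labels = torch.randint(0, 10, (64,))
    protos = compute_class_prototypes(feats, labels, num_classes=10)
    keep = prototypical_distance_select(feats, labels, protos, tau=0.5)
    reg = class_prototypical_regularizer(feats, labels, protos)
    print(f"kept {int(keep.sum())}/64 samples, regularizer = {reg.item():.3f}")

In a full pipeline one would expect the keep-mask to gate the supervised loss on each client and the regularizer to be added to the client objective with some weight; neither the weight nor the threshold is given in the abstract.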
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: true
Submission Guidelines: true
Anonymous Url: true
No Acknowledgement Section: true
Submission Number: 14369