FedFews: Robust Federated Knowledge Distillation with Prototypes for Few-Shot Learning

Published: 2025 · Last Modified: 07 Jan 2026 · ICIC (9) 2025 · License: CC BY-SA 4.0
Abstract: Federated learning has been employed to enhance model performance while protecting data privacy, but in few-shot scenarios, data scarcity and task asymmetry often cause distributional shifts. To address these challenges, we propose a few-shot federated learning approach called FedFews, which balances local and global knowledge learning and remains compatible with heterogeneous topologies under limited communication traffic. In FedFews, each client has a local metric classifier and a classifier modification module. The former follows a non-parametric knowledge distillation paradigm, transforming locally cached prototypes into a few-shot teacher model and distilling its knowledge. The latter uses a global prototype dictionary for supervised learning to rectify the local model's classifier. Evaluation results demonstrate that FedFews achieves promising performance compared to baseline algorithms in topology-heterogeneous scenarios. Furthermore, even with a 50% signal loss rate, the improvement rate remains at 92.5% of that achieved with lossless communication.
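The abstract describes two losses per client: distillation from a non-parametric prototype teacher and a rectification term driven by a global prototype dictionary. The following is a minimal PyTorch sketch of how such an objective could be assembled; the function names, the distance-based teacher, and the `alpha`/`temperature` weighting are illustrative assumptions, not the paper's actual formulation.

```python
import torch
import torch.nn.functional as F

def class_prototypes(features, labels, num_classes):
    """Average each class's few-shot features into a locally cached prototype."""
    protos = torch.zeros(num_classes, features.size(1), device=features.device)
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            protos[c] = features[mask].mean(dim=0)
    return protos

def prototype_teacher_probs(features, prototypes, temperature=4.0):
    """Non-parametric teacher: soft labels from negative distances to prototypes (assumed form)."""
    dists = torch.cdist(features, prototypes)            # (batch, num_classes)
    return F.softmax(-dists / temperature, dim=1)

def fedfews_style_loss(student_logits, features, local_protos, global_protos,
                       labels, temperature=4.0, alpha=0.5):
    """Sketch of a combined objective: (1) distill from the local prototype
    teacher, (2) rectify the classifier with the global prototype dictionary."""
    # (1) KD term: the student mimics the prototype teacher's soft predictions.
    teacher_probs = prototype_teacher_probs(features, local_protos, temperature)
    student_log_probs = F.log_softmax(student_logits / temperature, dim=1)
    kd_loss = F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")

    # (2) Rectification term: supervised loss against distances to the
    # server-shared global prototype dictionary (assumed cross-entropy form).
    global_logits = -torch.cdist(features, global_protos)
    rect_loss = F.cross_entropy(global_logits, labels)

    return alpha * kd_loss + (1 - alpha) * rect_loss

# Toy usage with random tensors standing in for one client's few-shot batch.
if __name__ == "__main__":
    num_classes, dim = 5, 64
    feats = torch.randn(20, dim)
    labels = torch.randint(0, num_classes, (20,))
    logits = torch.randn(20, num_classes)
    local_p = class_prototypes(feats, labels, num_classes)
    global_p = torch.randn(num_classes, dim)      # placeholder for the global dictionary
    print(fedfews_style_loss(logits, feats, local_p, global_p, labels).item())
```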