ACFFSL: a federated few-shot learning framework with contrastive learning and lightweight multi-scale attention

Published: 2025 · Last Modified: 09 Jan 2026 · Clust. Comput. 2025 · CC BY-SA 4.0
Abstract: Federated learning enables multiple parties to collaboratively train a model without sharing data. In real-world scenarios, however, each client typically holds only a limited number of samples, which significantly degrades the performance of traditional federated learning methods. While meta-learning is a common approach to few-shot problems, the non-independent and identically distributed (Non-IID) nature of the data across clients poses additional challenges. To address these issues, we propose a novel federated few-shot learning framework that integrates contrastive learning, attention mechanisms, and knowledge distillation. Our framework first introduces a lightweight multi-scale attention mechanism to extract critical feature representations from few-shot data. These features are then refined through contrastive learning, which enhances their separability by pulling together samples from the same class and pushing apart samples from different classes, thereby improving the alignment between the global and local models. Additionally, to handle the complexities introduced by Non-IID data, we employ decoupled knowledge distillation, which efficiently transfers essential knowledge from the global model to local clients, ensuring robust generalization across different distributions. Experimental results on the FC100 and miniImageNet datasets demonstrate that our proposed method achieves state-of-the-art accuracy, making it a promising solution for federated few-shot learning.
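The abstract does not specify the architecture of the lightweight multi-scale attention module, so the following is a purely hypothetical sketch of one common way to realize such a component: parallel depthwise convolutions at several kernel sizes gather multi-scale context, and a pointwise fusion produces a gating map that reweights the input features. The class name, kernel sizes, and fusion scheme are all assumptions, not the paper's design.

```python
import torch
import torch.nn as nn

class LightweightMultiScaleAttention(nn.Module):
    """Hypothetical sketch: depthwise convs at several kernel sizes gather
    multi-scale context; a 1x1 fusion conv yields a sigmoid gate that
    reweights the input feature map. Not the paper's actual module."""

    def __init__(self, channels, scales=(3, 5, 7)):
        super().__init__()
        # Depthwise (groups=channels) convolutions keep the module lightweight.
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, k, padding=k // 2,
                      groups=channels, bias=False)
            for k in scales
        )
        self.fuse = nn.Conv2d(channels * len(scales), channels, 1)  # pointwise fusion

    def forward(self, x):
        multi = torch.cat([b(x) for b in self.branches], dim=1)  # multi-scale context
        attn = torch.sigmoid(self.fuse(multi))                   # gate in [0, 1]
        return x * attn                                          # reweighted features
```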
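The pull-together/push-apart behavior described above matches a standard supervised contrastive objective (in the spirit of Khosla et al.'s SupCon). The paper does not give its exact loss, so this is a minimal sketch under that assumption; the temperature value and function name are illustrative.

```python
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(features, labels, temperature=0.1):
    """Pull same-class embeddings together, push different-class ones apart.

    features: (N, D) embeddings; labels: (N,) integer class indices.
    """
    features = F.normalize(features, dim=1)                 # unit-norm embeddings
    sim = features @ features.t() / temperature             # (N, N) cosine logits
    n = features.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=features.device)
    sim = sim.masked_fill(self_mask, float("-inf"))         # drop self-similarity

    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~self_mask
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)

    # Average log-probability over each anchor's positives; skip anchors
    # with no positive in the batch.
    pos_counts = pos_mask.sum(1).clamp(min=1)
    per_anchor = -log_prob.masked_fill(~pos_mask, 0.0).sum(1) / pos_counts
    return per_anchor[pos_mask.any(1)].mean()
```

In a federated setting, a loss of this form would be computed on each client's local batch; how the paper ties it to global-local alignment is not detailed in the abstract.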
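Decoupled knowledge distillation is an established technique (Zhao et al., CVPR 2022) that splits the classical KD loss into a target-class term (TCKD) and a non-target-class term (NCKD), weighted independently. A condensed sketch follows, with the global model as teacher and a local client model as student; the weights alpha, beta, and temperature T are assumptions, not values reported by the paper.

```python
import torch
import torch.nn.functional as F

def decoupled_kd_loss(student_logits, teacher_logits, labels,
                      alpha=1.0, beta=8.0, T=4.0):
    """DKD sketch: KD loss split into target (TCKD) and non-target (NCKD) parts."""
    gt_mask = F.one_hot(labels, student_logits.size(1)).bool()

    s_prob = F.softmax(student_logits / T, dim=1)
    t_prob = F.softmax(teacher_logits / T, dim=1)

    # TCKD: binary KL between (p_target, p_non-target) of student and teacher.
    s_bin = torch.stack([(s_prob * gt_mask).sum(1), (s_prob * ~gt_mask).sum(1)], dim=1)
    t_bin = torch.stack([(t_prob * gt_mask).sum(1), (t_prob * ~gt_mask).sum(1)], dim=1)
    tckd = F.kl_div(s_bin.log(), t_bin, reduction="batchmean") * T ** 2

    # NCKD: KL over the non-target classes only (target logit masked out).
    s_nt = F.log_softmax(student_logits / T - 1000.0 * gt_mask, dim=1)
    t_nt = F.softmax(teacher_logits / T - 1000.0 * gt_mask, dim=1)
    nckd = F.kl_div(s_nt, t_nt, reduction="batchmean") * T ** 2

    return alpha * tckd + beta * nckd
```

Weighting NCKD separately (beta) lets the transfer of "dark knowledge" among non-target classes be emphasized, which is the usual motivation for preferring DKD over vanilla KD under distribution shift.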