FedSDP: Federated Self-Derived Prototypes for Personalized Federated Learning

Published: 2025, Last Modified: 21 Jan 2026 · ICDE 2025 · CC BY-SA 4.0
Abstract: Federated learning (FL) is a privacy-preserving machine learning paradigm that enables multiple clients to train a model collaboratively. To address non-independent and identically distributed (non-IID) data across clients, personalized FL (PFL) has been actively investigated. A typical PFL model consists of two parts: 1) the head (i.e., classifier) for the final classification and 2) the body (i.e., feature extractor) for extracting representations from local datasets. The head is maintained separately on each client for personalization, while the body is aggregated for generalization. This study proposes a new PFL framework, Federated Self-Derived Prototypes (FedSDP), which introduces a bridge layer, called the personalized layer, between the head and the body to preserve individual, non-shared local prototypes for each client. The personalized layer decouples the body and the head, strengthening generalization and personalization, respectively. On top of this architecture, FedSDP dynamically balances personalization and generalization through two adjustments for generating self-derived prototypes: 1) the global-local similarity weight (GL-Sim Weight) and 2) the personalization early stopping indicator (P-Stop Indicator). The GL-Sim Weight, based on the similarity between the global and local prototypes, adjusts the degree of personalization of each local model. The P-Stop Indicator, computed from the degree of change in each client's local parameters, determines when to stop personalization early on a client so that it can concentrate further on generalization. Our comprehensive experiments demonstrate that FedSDP outperforms existing state-of-the-art FL frameworks, showing superior effectiveness in non-IID settings. Our code and data are available at https://github.com/bigbases/FedSDP.
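The abstract does not specify the exact formulas behind the two adjustments, but the descriptions suggest a similarity-based weight and a parameter-change threshold. A minimal sketch, assuming cosine similarity for the GL-Sim Weight and a relative-change threshold `tau` for the P-Stop Indicator (both function names and the threshold are hypothetical, not from the paper):

```python
import numpy as np

def gl_sim_weight(global_proto, local_proto):
    """Hypothetical GL-Sim Weight: cosine similarity between the global and
    local prototypes, mapped from [-1, 1] to [0, 1] so it can scale the
    degree of personalization of a local model."""
    g = np.asarray(global_proto, dtype=float)
    p = np.asarray(local_proto, dtype=float)
    cos = np.dot(g, p) / (np.linalg.norm(g) * np.linalg.norm(p) + 1e-12)
    return (cos + 1.0) / 2.0

def p_stop(prev_params, curr_params, tau=1e-3):
    """Hypothetical P-Stop Indicator: stop personalization early on a client
    once the relative change of its local parameters falls below tau, letting
    the client concentrate further on generalization."""
    prev = np.asarray(prev_params, dtype=float)
    curr = np.asarray(curr_params, dtype=float)
    delta = np.linalg.norm(curr - prev) / (np.linalg.norm(prev) + 1e-12)
    return delta < tau
```

Under this reading, identical global and local prototypes yield a weight near 1 (strong agreement, little personalization needed), while opposed prototypes yield a weight near 0; the actual definitions are given in the paper itself.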