Keywords: Personalized Federated Learning; Optimal Brain Damage; Model Decoupling
Abstract: Personalized Federated Learning (PFL) addresses the challenge of data heterogeneity across clients by adapting global knowledge to local data distributions. A promising approach within PFL is model decoupling, which separates the Federated Learning (FL) model into global and personalized parameters. Consequently, a key question in PFL is determining which parameters should be personalized to balance global knowledge sharing against local data adaptation. In this paper, we propose a parameter decoupling algorithm with a quantile-based thresholding mechanism and introduce an element-wise importance score, termed Federated Optimal Brain Personalization ({\tt FedOBP}). This score extends Optimal Brain Damage (OBD) pruning theory by incorporating a federated approximation of the first-order derivative in the Taylor expansion to evaluate each parameter's importance for personalization. Extensive experiments demonstrate that {\tt FedOBP} outperforms state-of-the-art methods across diverse datasets and heterogeneity scenarios, while personalizing only a small fraction of the model's parameters.
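The quantile-based thresholding described in the abstract can be sketched as follows. This is an illustrative outline, not the paper's exact method: the importance score here is a hypothetical first-order, OBD-style proxy (|gradient × parameter|), and the function name and the quantile level are assumptions for illustration.

```python
import numpy as np

def personalization_mask(params, grads, q=0.95):
    """Quantile-based parameter decoupling sketch (illustrative).

    Parameters whose importance score exceeds the q-quantile are
    flagged for local personalization; the rest remain global.
    The score |g_i * theta_i| is a hypothetical stand-in for the
    paper's FedOBP importance score.
    """
    scores = np.abs(grads * params)          # element-wise importance proxy
    threshold = np.quantile(scores, q)       # quantile-based threshold
    return scores > threshold                # True -> personalize this parameter

# Toy usage with random weights and gradients
rng = np.random.default_rng(0)
theta = rng.normal(size=1000)
g = rng.normal(size=1000)
mask = personalization_mask(theta, g, q=0.95)
print(mask.mean())  # fraction of parameters selected for personalization
```

With `q=0.95`, roughly 5% of the parameters are personalized, consistent with the decoupling idea of sharing most parameters globally while adapting only a small subset locally.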
Supplementary Material: zip
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 9428