Toward Personalized Federated Meta-Learning With Constrained Hypernetwork on Non-IID Data

Published: 2026, Last Modified: 27 Jan 2026 · IEEE Trans. Computers 2026 · CC BY-SA 4.0
Abstract: Personalized Federated Learning (pFL) tailors models to each client's local data distribution in heterogeneous federated learning settings. Federated Meta-Learning (FML) is a branch of pFL that uses meta-learning to achieve fast adaptation: clients start from a meta-model and personalize it by fine-tuning on local data. Since a single global meta-model is limited when client data distributions vary significantly, meta-model personalization should be considered in FML. However, most benchmark pFL methods lack meta-model personalization, and either omit meta-learning entirely or rely on a single global meta-model. Moreover, because measuring the distance between the meta-model and the client model in FML is challenging, these methods can guarantee neither generalization nor convergence. To address these issues, we combine FML with a hypernetwork and propose a constrained hypernetwork-based FML framework called FMLH, which uses the hypernetwork to capture the differences among fine-tuned models and thereby provide a personalized meta-model for each client. We provide rigorous mathematical proofs showing how the hypernetwork affects the convergence and generalization bounds of FMLH. Experimental results demonstrate that FMLH significantly improves model generalization under cross-client shifts, improving the lowest-decile accuracy by up to 18.71%. FMLH also outperforms representative pFL algorithms by up to 5.6% in maximum accuracy.
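The core idea the abstract describes — a server-side hypernetwork that maps per-client information to a personalized meta-model, rather than broadcasting one shared meta-model — can be sketched as follows. This is a minimal illustration only: the network shape, the dimensions, the client-embedding input, and all names are assumptions for exposition, not the paper's actual FMLH implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not taken from the paper).
EMBED_DIM = 8     # size of each learnable client embedding
HIDDEN_DIM = 16   # hypernetwork hidden layer width
META_DIM = 32     # flattened parameter count of the meta-model
NUM_CLIENTS = 4

# Server-side hypernetwork: a small MLP whose input is a client
# embedding and whose output is that client's meta-model parameters.
W1 = rng.normal(0.0, 0.1, (HIDDEN_DIM, EMBED_DIM))
W2 = rng.normal(0.0, 0.1, (META_DIM, HIDDEN_DIM))
client_embeddings = rng.normal(0.0, 1.0, (NUM_CLIENTS, EMBED_DIM))

def hypernetwork(embedding: np.ndarray) -> np.ndarray:
    """Generate flattened meta-model parameters for one client."""
    hidden = np.tanh(W1 @ embedding)
    return W2 @ hidden

# Each client receives its own meta-model instead of a single global
# one; local fine-tuning would then start from this parameter vector.
personalized = np.stack([hypernetwork(e) for e in client_embeddings])
print(personalized.shape)  # (NUM_CLIENTS, META_DIM)
```

Because distinct client embeddings produce distinct outputs, the hypernetwork captures differences among clients' fine-tuned models in its weights, which is the mechanism the abstract credits for per-client meta-model personalization.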