First-order Personalized Federated Meta-Learning via Over-the-Air Computations

24 Nov 2024 (modified: 29 Dec 2024) · AAAI 2025 Workshop AI4WCN Submission · CC BY 4.0
Keywords: Personalized federated learning, Meta learning, Over-the-air computations
Abstract: Federated learning (FL) is an emerging approach in machine learning that enables large-scale distributed model training without sharing local private data. However, a single generic global model often fails to meet the personalized demands of clients over heterogeneous networks. Gradient-based meta-learning, especially MAML, has become a viable solution to this problem. One issue with MAML is the computational and memory burden introduced by the second-order information needed to compute the meta-gradient. Additionally, the frequent communication between clients and the server for model updates in FL systems presents a significant communication bottleneck in wireless networks. In this paper, we propose a novel personalized federated meta-learning system that leverages only first-order information and utilizes over-the-air computations to improve communication efficiency. We prove the convergence of our algorithm under non-convex conditions and demonstrate its effectiveness through extensive numerical experiments.
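The abstract's core idea can be illustrated with a minimal sketch of one first-order MAML (FOMAML) round over several clients. This is not the paper's algorithm, just a generic illustration under simplifying assumptions: linear models with squared loss, one inner gradient step per client, and the server averaging the clients' first-order meta-gradients (the averaging is the operation that over-the-air computation would perform in the analog domain). All function names and hyperparameters here are hypothetical.

```python
import numpy as np

def grad_loss(w, X, y):
    # Gradient of the mean-squared-error loss for a linear model y ~ X @ w.
    return 2.0 * X.T @ (X @ w - y) / len(y)

def fomaml_round(w_global, clients, alpha=0.1, beta=0.05):
    """One illustrative round of first-order MAML over a set of clients.

    Each client adapts the global model with one inner gradient step on its
    support set, then reports the gradient of its query loss evaluated at the
    adapted parameters (no second-order terms). The server averages these
    first-order meta-gradients and takes an outer step; in an over-the-air
    setting, the average would be formed by the superposition of the clients'
    analog transmissions rather than by explicit summation at the server.
    """
    meta_grads = []
    for (X_sup, y_sup, X_qry, y_qry) in clients:
        # Inner (personalization) step on the client's support data.
        w_adapted = w_global - alpha * grad_loss(w_global, X_sup, y_sup)
        # First-order meta-gradient: query-loss gradient at adapted weights.
        meta_grads.append(grad_loss(w_adapted, X_qry, y_qry))
    # Outer (meta) step on the averaged meta-gradient.
    return w_global - beta * np.mean(meta_grads, axis=0)
```

Because the outer update uses the gradient at the adapted parameters directly, no Hessian-vector products are needed, which is the memory/compute saving that motivates first-order variants of MAML.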
Submission Number: 9