Keywords: Personalized federated learning, Meta-learning, Over-the-air computation
Abstract: Federated learning (FL) is an emerging approach in machine learning that enables large-scale distributed model training without sharing local private data. However, a generic global model often fails to meet the personalized demands of individual clients over heterogeneous networks. Gradient-based meta-learning, especially model-agnostic meta-learning (MAML), has become a viable solution to this problem. One issue with MAML is the computational and memory burden introduced by the second-order information needed to compute the meta-gradient. Additionally, frequent communication between clients and the server for model updates in FL systems presents a significant communication bottleneck in wireless networks.
In this paper, we propose a novel personalized federated meta-learning system that leverages only first-order information and utilizes over-the-air computation to improve communication efficiency. We prove the convergence of our algorithm in the non-convex setting and demonstrate its effectiveness through extensive numerical experiments.
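As a concrete illustration of the first-order idea the abstract alludes to, the sketch below shows a FOMAML-style meta-update averaged across clients: each client adapts the global model with one inner gradient step and returns the gradient at the adapted point, dropping the Hessian term that full MAML requires. This is only a minimal sketch under assumed details, not the paper's actual algorithm; the quadratic loss, step sizes, and all function names are hypothetical, and the server-side average stands in for the sum that over-the-air computation would realize via simultaneous analog transmission.

```python
# Hypothetical sketch of a first-order (FOMAML-style) per-client meta-update.
# All names and the least-squares loss are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def grad_loss(w, X, y):
    """Gradient of the mean squared error 0.5*||Xw - y||^2 / n w.r.t. w."""
    return X.T @ (X @ w - y) / len(y)

def client_meta_gradient(w_global, X, y, alpha=0.1):
    """First-order meta-gradient: adapt locally with one inner step, then
    return the gradient at the adapted point (omitting MAML's Hessian term)."""
    w_adapted = w_global - alpha * grad_loss(w_global, X, y)  # inner adaptation
    return grad_loss(w_adapted, X, y)                          # first-order outer gradient

# Server loop: average client meta-gradients. Over-the-air computation would
# obtain this aggregate directly from superimposed client transmissions.
d, n_clients = 5, 10
w = np.zeros(d)
clients = [(rng.normal(size=(20, d)), rng.normal(size=20)) for _ in range(n_clients)]
for _ in range(100):
    meta_grads = [client_meta_gradient(w, X, y) for X, y in clients]
    w -= 0.05 * np.mean(meta_grads, axis=0)  # meta (outer) update
```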
Submission Number: 9