Abstract: Federated Learning (FL) is a distributed machine learning framework that reduces communication costs and preserves privacy. Existing FL algorithms typically rely on the assumption of direct communication between the server and clients for model exchange. However, this assumption does not hold in many real-world scenarios where appropriate communication infrastructure is lacking, such as in remote smart sensing. To overcome this challenge, we propose a new framework, FedEx (Federated Learning via Model Express Delivery). FedEx employs mobile transporters, such as Unmanned Aerial Vehicles (UAVs), to establish indirect communication channels between the server and clients. We develop two algorithms under this framework, FedEx-Sync and FedEx-Async, which differ in whether the transporters operate on a synchronized or an asynchronous schedule. Although indirect communication introduces variable delays in global model dissemination and local model collection, we prove the convergence of both versions of FedEx. We further analyze the energy consumption of the transporters, integrate it with the convergence bounds, and propose a bi-level optimization algorithm for efficient client assignment and route planning. Experiments on two public datasets in a simulated environment further demonstrate the efficacy of FedEx.