Federated Oriented Learning: A Practical One-Shot Personalized Federated Learning Framework

Published: 01 May 2025, Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: We introduce Federated Oriented Learning (FOL), a one-shot personalized federated learning framework that improves local models under communication constraints through model alignment, ensemble refinement, and knowledge distillation.
Abstract: Personalized Federated Learning (PFL) has become a promising learning paradigm, enabling the training of high-quality personalized models through multiple communication rounds between clients and a central server. However, traditional PFL is difficult to apply directly in real-world environments where communication is expensive, limited, or infeasible, such as Low Earth Orbit (LEO) satellite constellations, which face severe communication constraints due to their high mobility and limited contact windows. To address these issues, we introduce Federated Oriented Learning (FOL), a novel four-stage one-shot PFL algorithm designed to enhance local model performance by leveraging neighboring models under stringent communication constraints. FOL comprises model pretraining, model collection, model alignment (via fine-tuning, pruning, post-fine-tuning, and ensemble refinement), and knowledge distillation stages. We establish two theoretical guarantees: a bound on the empirical risk discrepancy between the student and teacher models, and the convergence of the distillation process. Extensive experiments on the Wildfire, Hurricane, CIFAR-10, CIFAR-100, and SVHN datasets demonstrate that FOL consistently outperforms state-of-the-art one-shot Federated Learning (OFL) methods; for example, it achieves accuracy improvements of up to 39.24% over the baselines on the Wildfire dataset.
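The alignment stage described above includes a pruning step applied to each collected model. The paper does not specify the pruning criterion here, so the following is only a minimal illustrative sketch of one common choice, one-shot global magnitude pruning, applied to a flat list of weights; the function name and interface are hypothetical, not FOL's actual implementation.

```python
def magnitude_prune(weights, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of weights.

    Illustrative one-shot global magnitude pruning; `sparsity` is the
    fraction of weights to remove. This is a simplification: FOL's
    pruning stage operates on full neighbor models, not flat lists.
    """
    k = int(len(weights) * sparsity)
    if k == 0:
        return list(weights)
    # Threshold at the k-th smallest absolute value; weights at or
    # below it are set to zero, the rest are kept unchanged.
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]
```

For example, `magnitude_prune([0.1, -0.2, 0.3, -0.4], sparsity=0.5)` keeps only the two largest-magnitude weights, returning `[0.0, 0.0, 0.3, -0.4]`.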
Lay Summary: When many devices such as Low-Earth-Orbit (LEO) satellites, delivery drones, or Internet of Things (IoT) sensors with intermittent connectivity collect data in different places, they could learn better if they shared what they know. Unfortunately, these devices can often connect with one another only briefly, within very limited communication windows, so traditional federated-learning algorithms (which send large models back and forth many times) are impractical. We introduce Federated Oriented Learning (FOL), a one-shot approach: each device exchanges models with its neighbors only once yet still ends up with a fully personalized model. For every model it receives, FOL (i) fine-tunes it on local data, (ii) prunes away irrelevant parameters, (iii) post-fine-tunes it again, and (iv) merges the best-matched models into a compact yet powerful teacher. The device then distills knowledge from this teacher into its own student model, while keeping the student the same size and structure as the original local model. We prove that this distillation step converges and that the student's error remains close to the teacher's. On real wildfire and hurricane satellite imagery, as well as on standard image benchmarks, FOL outperforms existing one-shot approaches by as much as 39 percentage points, showing that personalized models can be learned effectively even under severe communication constraints.
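The final distillation step above transfers the merged teacher's knowledge into the local student model. As a rough illustration of what such a step typically minimizes, the sketch below computes the standard temperature-softened KL-divergence distillation loss between teacher and student logits (in the style of Hinton-type distillation); this is an assumed, generic formulation, not FOL's exact objective, and all names here are hypothetical.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T produces softer distributions.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence from the softened teacher distribution to the
    softened student distribution, scaled by T^2 (the usual gradient
    correction in temperature-based distillation)."""
    p = softmax(teacher_logits, temperature)  # teacher "soft targets"
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl
```

When the student's logits match the teacher's exactly, the loss is zero; any disagreement yields a positive loss, which training drives down so the student mimics the teacher while keeping its original architecture.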
Application-Driven Machine Learning: This submission is on Application-Driven Machine Learning.
Link To Code: https://app.box.com/s/phf6bhjy6owcr6b1rvfe412fiw059pxk
Primary Area: Applications
Keywords: Low Earth Orbit Satellites, Model Personalization, One-Shot Federated Learning, Communication Constraints
Submission Number: 12166