FedUFD: Personalized Edge Computing Using Federated Uncertainty-Driven Feature Distillation

Published: 01 Jan 2025, Last Modified: 22 Jul 2025 · INFOCOM 2025 · CC BY-SA 4.0
Abstract: Recently, federated learning (FL) has been considered a promising and well-suited technique for edge computing applications such as intelligent traffic control, autonomous driving, and mobile crowdsensing. However, since each edge device may perform individual-specific tasks, devices often have heterogeneous data distributions that degrade the performance of collaboratively trained models. Personalized FL (PFL) has therefore received considerable attention as a way to tackle this problem. Many existing PFL works employ knowledge distillation to mitigate the negative effects of data heterogeneity. Nevertheless, these works often neglect the fact that the knowledge transferred from teacher models is not completely correct, which limits the personalization performance of edge devices. In this work, we leverage the knowledge contained in global features to exploit the potential of global models and propose a novel uncertainty-driven feature distillation framework called FedUFD. Specifically, we design an uncertainty estimation module in the local models: by estimating the uncertainty of the personalized feature distribution, FedUFD measures the difficulty of learning different personalized features and then combines the global features to distill the corresponding personalized features. Extensive experiments show that FedUFD outperforms fourteen state-of-the-art PFL frameworks in edge computing, beating the best-performing traditional and personalized baselines by up to 45.45% and 3.55%, respectively.
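The core idea, weighting the feature distillation loss by an estimated uncertainty so that features the local model finds hard to personalize contribute less, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function name, the use of a learned per-feature log-variance as the uncertainty estimate, and the heteroscedastic-style penalty term are all assumptions, since the abstract does not specify the exact estimator.

```python
import numpy as np

def uncertainty_weighted_distill_loss(local_feats, global_feats, log_var):
    """Hedged sketch of an uncertainty-driven feature distillation loss.

    local_feats, global_feats: (batch, dim) feature arrays from the local
    (student) and global (teacher) models.
    log_var: (dim,) per-feature log-variance, a hypothetical stand-in for
    the paper's uncertainty estimation module.

    High-uncertainty features (hard to learn, or where the teacher may be
    unreliable) receive lower distillation weight; adding log_var back as a
    penalty discourages trivially inflating the uncertainty.
    """
    precision = np.exp(-log_var)                  # inverse-variance weights
    sq_err = (local_feats - global_feats) ** 2    # per-feature squared error
    loss = np.mean(precision * sq_err + log_var)  # weighted error + penalty
    return float(loss)
```

With `log_var` fixed at zero the loss reduces to a plain mean-squared feature-matching loss; raising the log-variance of a feature smoothly down-weights its contribution to the distillation term.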