FedDr+: Stabilizing Dot-regression with Global Feature Distillation for Federated Learning

Published: 26 Aug 2024, Last Modified: 26 Aug 2024
Venue: FedKDD 2024 Poster
License: CC BY 4.0
Keywords: Federated learning, Dot-regression, Knowledge distillation
TL;DR: Our novel algorithm, FedDr+, enhances client model alignment using a dot-regression loss and a feature distillation mechanism, thereby improving the performance of FL.
Abstract: Federated Learning (FL) has emerged as a pivotal framework for developing effective global models across clients with heterogeneous, non-IID data distributions. A key challenge in FL is client drift, where data heterogeneity impedes the aggregation of scattered knowledge. Recent studies have tackled client drift by identifying significant divergence in the last classifier layer. To mitigate this divergence, strategies such as freezing the classifier weights and aligning the feature extractor accordingly have proven effective. However, while local alignment between the classifier and the feature extractor is crucial in FL, it may cause the model to overemphasize the observed classes within each client. Our objective is twofold: (1) enhancing local alignment while (2) preserving the representation of unseen class samples. We introduce a novel algorithm named FedDr+, which enhances local model alignment using a dot-regression loss. FedDr+ freezes the classifier as a simplex equiangular tight frame (ETF) to align features, and improves the aggregated global model through a feature distillation mechanism that retains information about unseen/missing classes. Empirical evidence demonstrates that our algorithm surpasses existing methods that use a frozen classifier to enhance alignment across diverse data distributions.
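To make the two objectives concrete, the following is a minimal PyTorch sketch of the local objective described in the abstract. The ETF construction follows the standard simplex-ETF definition; the helper names (`simplex_etf`, `feddrplus_loss`), the cosine form of the distillation term, and the weight `beta` are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn.functional as F

def simplex_etf(num_classes: int, feature_dim: int) -> torch.Tensor:
    """Fixed simplex-ETF classifier: num_classes equal-norm, equiangular prototypes."""
    assert feature_dim >= num_classes - 1
    # Random orthonormal basis (feature_dim x num_classes), then project onto
    # the simplex-ETF structure: M = sqrt(K/(K-1)) * U @ (I - 11^T / K).
    u, _ = torch.linalg.qr(torch.randn(feature_dim, num_classes))
    center = torch.eye(num_classes) - torch.ones(num_classes, num_classes) / num_classes
    etf = (num_classes / (num_classes - 1)) ** 0.5 * (u @ center)
    return F.normalize(etf, dim=0)  # columns are unit-norm class prototypes

def dot_regression_loss(features, labels, etf):
    """Pull each feature toward its own class prototype only (no push on other classes)."""
    cos = F.normalize(features, dim=1) @ etf          # (B, K) cosine similarities
    target_cos = cos[torch.arange(labels.size(0)), labels]
    return 0.5 * ((target_cos - 1.0) ** 2).mean()

def feddrplus_loss(local_feats, global_feats, labels, etf, beta=1.0):
    """Dot-regression alignment plus distillation toward the frozen global features."""
    distill = 1.0 - F.cosine_similarity(local_feats, global_feats.detach(), dim=1).mean()
    return dot_regression_loss(local_feats, labels, etf) + beta * distill
```

In a local round, `global_feats` would be produced by the frozen, freshly aggregated global feature extractor on the same minibatch, so the distillation term anchors local features to the global representation even for classes absent from the client's data.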
Submission Number: 8