Fed3R: Recursive Ridge Regression for Federated Learning with strong pre-trained models

Published: 28 Oct 2023, Last Modified: 21 Nov 2023. FL@FM-NeurIPS'23 Poster.
Keywords: Federated Learning, Ridge Regression, Random Features, Statistical Heterogeneity, Client Drift, Destructive Interference, Pre-trained models
TL;DR: Recursive Ridge Regression enables fast convergence in Federated Learning with pre-trained models
Abstract: Federated Learning offers a powerful solution for training models on data that cannot be centrally stored due to privacy concerns. However, the existing paradigm suffers from high statistical heterogeneity across clients' data, resulting in client drift caused by biased local solutions. This issue is particularly pronounced in the final classifier layer and severely impedes convergence during aggregation. To overcome these challenges, we introduce Federated Recursive Ridge Regression (Fed3R). This approach replaces the gradient-based classifier with a ridge regression classifier computed in closed form, ensuring resilience to client drift and drastically reducing convergence time and communication costs. The incremental formulation of Fed3R is exactly equivalent to the ideal centralized ridge regression solution, enabling the use of more complex architectures with pre-trained parameters and strong generalization capabilities that are incompatible with previous federated learning techniques. We propose Fed3R in three variants, with Fed3R-RF significantly enhancing performance to levels comparable to centralized training while remaining competitive in total communication costs.
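The equivalence to the centralized solution claimed in the abstract follows from ridge regression being linear in its sufficient statistics: each client can send the Gram matrix and cross-correlation of its (e.g., pre-trained-backbone) features, and the server's aggregate closed-form solution matches training on the pooled data. The sketch below is illustrative only (all variable names and the two-client setup are assumptions, not the paper's code), showing this property with NumPy:

```python
import numpy as np

def client_stats(Phi, Y):
    # Each client shares only the ridge sufficient statistics:
    # Gram matrix Phi^T Phi and cross term Phi^T Y (no raw data).
    return Phi.T @ Phi, Phi.T @ Y

def federated_ridge(stats, lam):
    # Server sums the statistics; since ridge regression is linear in
    # them, the closed-form solve equals the centralized solution.
    A = sum(s[0] for s in stats)
    b = sum(s[1] for s in stats)
    d = A.shape[0]
    return np.linalg.solve(A + lam * np.eye(d), b)

rng = np.random.default_rng(0)
# Two statistically heterogeneous clients (shifted feature distributions).
Phi1, Y1 = rng.normal(0.0, 1.0, (50, 8)), rng.normal(0.0, 1.0, (50, 3))
Phi2, Y2 = rng.normal(2.0, 1.0, (30, 8)), rng.normal(0.0, 1.0, (30, 3))
lam = 1e-2

W_fed = federated_ridge([client_stats(Phi1, Y1), client_stats(Phi2, Y2)], lam)

# Centralized baseline on the pooled data.
Phi, Y = np.vstack([Phi1, Phi2]), np.vstack([Y1, Y2])
W_cen = np.linalg.solve(Phi.T @ Phi + lam * np.eye(8), Phi.T @ Y)
print(np.allclose(W_fed, W_cen))
```

Because only aggregated statistics are exchanged and the aggregation is a sum, the result is independent of how the data is partitioned across clients, which is why such a classifier is immune to client drift.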
Student Author Indication: Yes
Submission Number: 55