Noise-Robust Federated Learning With Model Heterogeneous Clients

Published: 01 Jan 2025, Last Modified: 15 May 2025 · IEEE Trans. Mob. Comput. 2025 · CC BY-SA 4.0
Abstract: Federated Learning (FL) enables multiple devices to collaboratively train models without sharing their raw data. Since clients may prefer to design their own models independently, model-heterogeneous FL has emerged. Additionally, due to annotation uncertainty, the collected data usually contain unavoidable and varying label noise, which existing FL algorithms cannot effectively address. This paper presents a novel solution that handles model heterogeneity and label noise simultaneously in a single framework. It has three key features: (1) For communication between heterogeneous models, we directly align model feedback using easily accessible public data, without requiring additional global models or relevant data for collaboration. (2) For internal label noise in each client, we design a dynamic label refinement strategy to mitigate its negative effects. (3) For the challenging noisy feedback from other participants, we design an enhanced client confidence re-weighting scheme that adaptively assigns a weight to each client during the collaborative learning stage. Extensive experiments validate the effectiveness of our approach in mitigating the negative effects of various noise rates and types under both model-homogeneous and model-heterogeneous FL settings.
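Since only the abstract is shown here, the sketch below is a rough illustration of the general pattern it describes, not the authors' exact formulation: heterogeneous clients exchange predictions on a shared public set, peer feedback is aggregated with confidence-based weights, and noisy local labels are refined against model predictions. The function names, the softmax-over-confidence weighting, and the convex label-blending rule are all illustrative assumptions.

```python
import numpy as np

def aggregate_peer_logits(peer_logits, peer_confidences, temperature=1.0):
    """Combine per-client predictions on shared public data.

    peer_logits: list of (N, C) arrays, one per client; clients may use
    different architectures internally, since only outputs on the public
    set are exchanged.
    peer_confidences: one scalar trust score per client (e.g., derived
    from agreement with the local model); higher means more trusted.
    Returns an (N, C) consensus target for collaborative learning.
    """
    weights = np.exp(np.asarray(peer_confidences) / temperature)
    weights /= weights.sum()                   # softmax re-weighting over clients
    stacked = np.stack(peer_logits)            # (K, N, C)
    return np.tensordot(weights, stacked, 1)   # weighted sum -> (N, C)

def refine_labels(noisy_onehot, model_probs, alpha):
    """Dynamic label refinement as a convex blend of the given (possibly
    noisy) labels with the model's own predictions. alpha in [0, 1] can
    grow over rounds as the local model becomes more reliable."""
    return (1.0 - alpha) * noisy_onehot + alpha * model_probs
```

In this reading, down-weighting low-confidence peers limits the spread of noisy feedback across clients, while the blending coefficient controls how aggressively each client trusts its own predictions over its original annotations.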