Do Global and Local Perform Cooperatively or Adversarially in Heterogeneous Federated Learning?

Published: 11 Feb 2025 · Last Modified: 06 Mar 2025 · CPAL 2025 (Proceedings Track) Poster · CC BY 4.0
Keywords: federated learning; multilevel optimization; learning dynamics
Abstract: Heterogeneous federated learning (Hetero-FL) is an emerging machine learning framework that enables collaborative model training across devices with varying capabilities and data, without sharing raw data. In Hetero-FL, two types of trainers exhibit distinct behaviors: the Global Trainer (GTr), which prioritizes average performance but lacks fine-grained client insights, and the Local Trainer (LTr), which handles local issues and excels on local data but struggles to generalize. Combining the two is therefore crucial to obtaining a strong GTr. Unlike prevalent personalization strategies that supplement GTr with LTr, our work introduces a novel approach in which GTr and LTr cooperate adversarially: the adversarial behavior of LTr can unexpectedly enhance the overall performance of GTr in the combined global-local training process. Building on an in-depth understanding of this adversarial cooperation, we propose an alternating training strategy named Fed A(dversarial) B(ased) C(ooperation) (FedABC), which follows a "G-L-G-L" schedule: LTr increases the global loss, preventing GTr from getting trapped in local minima. Our comprehensive experiments show accuracy gains of up to 13.77% and faster convergence compared with existing state-of-the-art Hetero-FL methods, and we validate the effectiveness and efficiency of our approach in terms of fairness, generalizability, and long-term behavior. Ultimately, our method informs the design of training strategies for Hetero-FL, emphasizing adversarial cooperation between GTr and LTr in real-world scenarios.
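
The abstract only sketches FedABC's alternating "G-L-G-L" schedule, in which the Local Trainer deliberately increases the global loss between Global Trainer steps. Below is a minimal toy sketch of that alternation pattern; the quadratic per-client losses, step sizes, and variable names are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Toy illustration of the alternating "G-L-G-L" schedule described in the abstract.
# Quadratic per-client losses stand in for heterogeneous local data; client_optima,
# eta_g, and eta_l are illustrative assumptions, not values from the paper.
rng = np.random.default_rng(0)
client_optima = rng.normal(size=(5, 2))  # 5 clients, 2-dimensional model

def global_loss(w):
    # Average of per-client losses: the objective the Global Trainer (GTr) minimizes.
    return float(np.mean([np.sum((w - c) ** 2) for c in client_optima]))

def global_grad(w):
    return np.mean([2.0 * (w - c) for c in client_optima], axis=0)

w = np.zeros(2)
eta_g, eta_l = 0.1, 0.05  # assumed step sizes for the G and L phases
for rnd in range(50):
    # G step: GTr descends on the averaged (global) loss.
    w = w - eta_g * global_grad(w)
    # L step: LTr acts adversarially, perturbing the model to increase the global
    # loss, mirroring the paper's idea of keeping GTr away from poor local minima.
    w = w + eta_l * global_grad(w)

print("final global loss:", global_loss(w))
```

In the actual method the adversarial step would operate on each client's local model and data rather than on a shared toy objective; the sketch only conveys the alternation pattern, not the paper's loss formulation.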
Submission Number: 58