FedDGA: Federated Multitask Learning Based on Dynamic Guided Attention

Published: 01 Jan 2025 · Last Modified: 11 Apr 2025 · IEEE Trans. Artif. Intell. 2025 · CC BY-SA 4.0
Abstract: The proliferation of privacy-sensitive data has spurred the development of federated learning (FL), an important technology for state-of-the-art machine learning and responsible AI. However, most existing FL methods are constrained in applicability and generalizability by their narrow focus on a single task. This article presents a novel federated multitask learning (FMTL) framework capable of acquiring knowledge across multiple tasks. To address the challenges posed by non-IID data and task imbalance in FMTL, this study proposes a federated fusion strategy based on dynamic guided attention (FedDGA), which adaptively fine-tunes local models for multiple tasks with personalized attention. In addition, this article introduces a dynamic batch weight (DBW) mechanism that balances the task losses and improves convergence speed. Extensive experiments were conducted across various datasets, tasks, and settings, comparing the proposed method with state-of-the-art methods such as FedAvg, FedProx, and SCAFFOLD. The results show that our method achieves significant performance gains, with up to an 11.1% increase in accuracy over the baselines.
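The abstract describes DBW as a scheme that re-weights per-task losses each batch to counter task imbalance. The exact formula is not given here, so the following is only a minimal sketch of one plausible realization: weighting each task by the ratio of its current to previous batch loss (in the spirit of dynamic weight averaging), so tasks whose loss is shrinking slowly receive larger weights. The function name `dynamic_batch_weights` and the temperature parameter are illustrative assumptions, not the paper's API.

```python
import torch

def dynamic_batch_weights(task_losses, prev_losses, temperature=2.0):
    """Hypothetical sketch of per-batch task-loss weighting.

    Assumption: tasks are weighted by the ratio of current to previous
    batch loss, normalized with a softmax so the weights sum to the
    number of tasks. This is NOT the paper's exact DBW formula, which
    the abstract does not specify.
    """
    ratios = torch.tensor(
        [cur / max(prev, 1e-8) for cur, prev in zip(task_losses, prev_losses)]
    )
    # Higher ratio -> loss decreasing slowly -> larger weight.
    weights = torch.softmax(ratios / temperature, dim=0) * len(task_losses)
    return weights

# Usage: combine per-task losses into one scalar objective for backprop.
losses = [1.2, 0.4, 0.9]   # current batch losses for 3 tasks (illustrative)
prev = [1.5, 0.5, 0.8]     # losses from the previous batch
w = dynamic_batch_weights(losses, prev)
total = sum(wi * li for wi, li in zip(w.tolist(), losses))
```

In a full FMTL loop this weighting would be recomputed every batch, so slow-converging tasks are emphasized automatically without manual tuning of per-task coefficients.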