Parameter-oriented contrastive schema and multi-level knowledge distillation for heterogeneous federated learning

Published: 01 Jan 2025, Last Modified: 03 Aug 2025, Inf. Fusion 2025, CC BY-SA 4.0
Abstract highlights:
- Devise a parameter-oriented contrastive schema to rectify the update directions of local clients.
- Propose a multi-level knowledge distillation scheme to transfer the global model's knowledge.
- Provide a theoretical analysis of model generalization and convergence.
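The two highlighted components can be illustrated with a minimal sketch. The exact formulation is not given in this page, so the update rule below (attract local parameters to the global model, repel them from the stale local model) and the distillation loss (feature-level MSE plus logit-level KL from the global teacher) are assumptions for illustration, not the paper's actual method; all function names and hyperparameters (`mu`, `nu`, `alpha`) are hypothetical.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def contrastive_step(w_local, w_global, w_prev, grad, lr=0.1, mu=0.5, nu=0.1):
    """Parameter-level contrastive correction (sketch, assumed form):
    attract the local parameters toward the global model (positive pair)
    and repel them from the previous local model (negative pair)."""
    correction = mu * (w_local - w_global) - nu * (w_local - w_prev)
    return w_local - lr * (grad + correction)

def multilevel_kd_loss(student_feat, teacher_feat,
                       student_logits, teacher_logits, alpha=0.5):
    """Multi-level distillation (sketch, assumed form): intermediate-feature
    MSE plus KL divergence between teacher and student output distributions."""
    feat_loss = np.mean((student_feat - teacher_feat) ** 2)
    p_t = softmax(teacher_logits)
    p_s = softmax(student_logits)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)))
    return alpha * feat_loss + (1.0 - alpha) * kl
```

With `mu > nu`, each local step drifts toward the shared global parameters, which is one plausible reading of "rectifying the update directions of local clients"; the distillation loss vanishes when student and teacher agree at both levels.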