Parameter-oriented contrastive schema and multi-level knowledge distillation for heterogeneous federated learning
Highlights
• Devise a parameter-oriented contrastive schema to rectify the update directions of local clients.
• Propose a multi-level knowledge distillation method to transfer the global model's knowledge.
• Provide theoretical analyses of model generalization and convergence.