TL;DR: Dynamic Temperature Design for Federated Heterogeneous Logits Distillation.
Abstract: Federated Distillation (FedKD) relies on lightweight knowledge carriers like logits for efficient client-server communication.
Although logit-based methods have demonstrated promise in addressing statistical and architectural heterogeneity in federated learning (FL), current approaches remain constrained by suboptimal temperature calibration during knowledge fusion.
To address these limitations, we propose ReT-FHD, a framework featuring: 1) Multi-level Elastic Temperature, which dynamically adjusts distillation intensities across model layers to optimize knowledge transfer between heterogeneous local models; 2) Category-Aware Global Temperature Scaling, which calibrates class-specific temperatures from the confidence distributions of the global logits, enabling personalized distillation policies; and 3) Z-Score Guard, a blockchain-verified validation mechanism that mitigates 44% of label-flipping and model poisoning attacks. Evaluations across diverse benchmarks with varying model/data heterogeneity demonstrate that ReT-FHD achieves significant accuracy improvements over baseline methods while substantially reducing communication costs compared to existing approaches. Our work establishes that properly calibrated logits can serve as self-sufficient carriers for building scalable and secure heterogeneous FL systems.
Lay Summary: A core challenge in federated learning (FL) is keeping model logits effective and secure across devices with heterogeneous data or architectures. We address this by introducing dynamic temperature scaling that adapts logits to device-specific variations, together with a blockchain verification mechanism. We show that calibrated logits, without auxiliary parameters, act as self-sufficient representations for scalable and secure FL systems, bridging logit-centric optimization with practical heterogeneity.
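The submission page does not include code, so the sketch below is only a minimal, hypothetical PyTorch illustration of the kind of mechanisms the abstract describes: deriving a per-class temperature from the confidence of aggregated global logits, and applying a z-score filter to screen outlier client logits. The function names (`category_aware_temperatures`, `zscore_filter`, `distill_loss`), the mean aggregation, and the thresholds are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def category_aware_temperatures(global_logits, t_min=1.0, t_max=4.0):
    # Per-class confidence from the aggregated global logits: higher-confidence
    # classes get a lower (sharper) temperature, low-confidence classes a higher one.
    conf = F.softmax(global_logits, dim=-1).mean(dim=0)          # (C,) mean confidence per class
    conf = (conf - conf.min()) / (conf.max() - conf.min() + 1e-8)
    return t_max - (t_max - t_min) * conf                        # (C,) temperature per class

def zscore_filter(client_logits, threshold=2.5):
    # Flag clients whose logits deviate strongly from the cohort mean
    # (a simple proxy for label-flipping / poisoned updates).
    flat = client_logits.flatten(1)                              # (K, N*C)
    dist = (flat - flat.mean(dim=0)).norm(dim=1)                 # each client's distance to the mean
    z = (dist - dist.mean()) / (dist.std() + 1e-8)
    return z.abs() <= threshold                                  # boolean mask of accepted clients

def distill_loss(student_logits, global_logits, temperatures):
    # Class-wise tempered KL divergence between student and aggregated global logits.
    t = temperatures.unsqueeze(0)                                # broadcast temperatures over the batch
    p_teacher = F.softmax(global_logits / t, dim=-1)
    log_p_student = F.log_softmax(student_logits / t, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean")

if __name__ == "__main__":
    K, N, C = 5, 32, 10                                          # clients, samples, classes
    client_logits = torch.randn(K, N, C)
    keep = zscore_filter(client_logits)                          # drop outlier clients
    global_logits = client_logits[keep].mean(dim=0)              # simple average as the global carrier
    temps = category_aware_temperatures(global_logits)
    loss = distill_loss(torch.randn(N, C), global_logits, temps)
    print(keep.tolist(), temps.tolist(), loss.item())
```

In this toy setup, the server-side filter and per-class temperatures are computed once per round; a real system would also verify the accepted logits against the blockchain record before distillation, as the abstract's Z-Score Guard suggests.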
Primary Area: Social Aspects->Privacy
Keywords: Heterogeneous Federated Learning, Logits Distillation
Submission Number: 600