Highlights

• We propose HDSFL, a new federated SNN training framework that reduces communication costs by roughly 1–2 orders of magnitude without degrading accuracy.
• We design a distillation loss that incorporates hint-layer knowledge distillation.
• We propose a new federated knowledge aggregation strategy weighted by the confidence of each client.
• We design a spike tensor compression strategy for spike features.
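The last highlight concerns compressing spike features for transmission. Since SNN activations are binary (0/1) spike events, one natural approach is bit-packing, which shrinks a spike tensor by 8x versus bytes or 32x versus float32. The sketch below is an illustrative assumption, not necessarily the paper's exact strategy; the function names `compress_spikes` / `decompress_spikes` are hypothetical.

```python
import numpy as np

def compress_spikes(spikes: np.ndarray) -> tuple[bytes, tuple[int, ...]]:
    """Pack a binary (0/1) spike tensor into a compact byte payload.

    Illustrative only: real federated pipelines would also exploit
    temporal sparsity (e.g. run-length or event coding).
    """
    flat = spikes.astype(np.uint8).ravel()
    return np.packbits(flat).tobytes(), spikes.shape

def decompress_spikes(payload: bytes, shape: tuple[int, ...]) -> np.ndarray:
    """Recover the original 0/1 spike tensor from the packed bytes."""
    n = int(np.prod(shape))
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8), count=n)
    return bits.reshape(shape)

# Example: a sparse 4x8x8 spike tensor (256 spike sites).
rng = np.random.default_rng(0)
spikes = (rng.random((4, 8, 8)) < 0.2).astype(np.uint8)

payload, shape = compress_spikes(spikes)
restored = decompress_spikes(payload, shape)
assert np.array_equal(spikes, restored)   # lossless round trip
# 256 binary values fit in 32 bytes, versus 1024 bytes as float32.
```

Even this simple scheme shows why spike features are cheap to communicate compared with dense float activations, which motivates exchanging distilled spike knowledge rather than full model weights.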