Federated learning for spiking neural networks by hint-layer knowledge distillation

Published: 01 Jan 2024 · Last Modified: 05 Mar 2025 · Appl. Soft Comput. 2024 · CC BY-SA 4.0
Abstract:

Highlights:
- We propose a new federated SNN training framework (HDSFL) that reduces communication costs by roughly 1–2 orders of magnitude without degrading accuracy.
- We design a distillation loss function that incorporates hint-layer knowledge distillation.
- We propose a new federated knowledge aggregation strategy based on the confidence of each client.
- We design a spike tensor compression strategy for spike features.