Efficient Logit-based Knowledge Distillation of Deep Spiking Neural Networks for Full-Range Timestep Deployment

Published: 01 May 2025, Last Modified: 18 Jun 2025. ICML 2025 poster. License: CC BY 4.0
TL;DR: Considering the spatiotemporal characteristics of Spiking Neural Networks, we optimize logit-based knowledge distillation in deep SNNs training. The method is neat and effective, and we provide both theoretical and empirical analysis as evidence.
Abstract: Spiking Neural Networks (SNNs) are emerging as a brain-inspired alternative to traditional Artificial Neural Networks (ANNs), prized for their potential energy efficiency on neuromorphic hardware. Despite this, SNNs often suffer from accuracy degradation compared to ANNs and face deployment challenges due to fixed inference timesteps, which require retraining for adjustment, limiting operational flexibility. To address these issues, our work exploits the spatio-temporal property inherent in SNNs and proposes a novel distillation framework for deep SNNs that optimizes performance across full-range timesteps without timestep-specific retraining, enhancing both efficacy and deployment adaptability. We provide both theoretical analysis and empirical validation to show that our training guarantees the convergence of all implicit models across full-range timesteps. Experimental results on CIFAR-10, CIFAR-100, CIFAR10-DVS, and ImageNet demonstrate state-of-the-art performance among distillation-based SNN training methods. Our code is available at https://github.com/Intelli-Chip-Lab/snn_temporal_decoupling_distillation.
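To make the core idea concrete, here is a minimal sketch of temporally decoupled logit distillation: the student's cumulative-average logits at every timestep are matched against the teacher's logits, so the implicit model obtained by stopping at any timestep is directly supervised. This assumes the average-over-timesteps readout common in SNN classifiers; the function names, the plain KL objective, and the temperature value are illustrative assumptions, not the paper's exact loss.

```python
import numpy as np

def softmax(x, temperature=1.0):
    # Numerically stable softmax over the last axis.
    z = x / temperature
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def per_timestep_kd_loss(student_logits, teacher_logits, temperature=2.0):
    """Hypothetical per-timestep distillation objective.

    student_logits: (timesteps, batch, classes) -- one logit map per timestep.
    teacher_logits: (batch, classes) -- static ANN teacher logits.

    At each timestep t, distill the running average of the student's logits
    up to t toward the teacher, so every truncated-timestep model is trained.
    """
    p_teacher = softmax(teacher_logits, temperature)
    cumulative = np.zeros_like(teacher_logits)
    losses = []
    for t, logits in enumerate(student_logits, start=1):
        cumulative = cumulative + logits
        p_student = softmax(cumulative / t, temperature)
        # KL(teacher || student), scaled by T^2 as in standard logit distillation.
        kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12)
                                 - np.log(p_student + 1e-12)), axis=-1)
        losses.append((temperature ** 2) * kl.mean())
    # Averaging over timesteps weights all implicit models equally.
    return float(np.mean(losses))
```

Averaging the loss over all timesteps, rather than supervising only the final readout, is what lets one trained network be deployed at any timestep budget without retraining.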
Lay Summary: Spiking Neural Networks process information with brief electrical pulses, much like biological neurons, and can run very efficiently on specialized hardware. However, they are usually less accurate than conventional neural networks and must be retrained whenever the number of processing steps changes. We propose a simple, effective training method in which a conventional network acts as a teacher, so a single trained spiking network performs well across any number of processing steps without retraining, and we support this with both theoretical and empirical analysis.
Link To Code: https://github.com/Intelli-Chip-Lab/snn_temporal_decoupling_distillation
Primary Area: Deep Learning->Everything Else
Keywords: Spiking Neural Networks, Knowledge Distillation, Image Classification, Backpropagation
Submission Number: 10867