Knowledge Distillation for Spiking Neural Networks

Published: 01 Jan 2024 · Last Modified: 19 Jan 2025 · IJCNN 2024 · CC BY-SA 4.0
Abstract: Spiking Neural Networks (SNNs) are brain-inspired, event-driven networks that are emerging as a promising energy-efficient alternative to Artificial Neural Networks (ANNs). However, the performance of directly trained SNNs remains far from satisfactory. Inspired by the idea of Teacher–Student Learning, in this paper we study a novel learning method named SuperSNN, which uses an ANN model to guide the learning of an SNN model. SuperSNN leverages knowledge distillation to learn comprehensive supervisory information from pre-trained ANN models rather than solely from labeled data. Unlike previous work that naively matches SNN and ANN features without considering the precision mismatch between them, we propose an indirect, relation-based approach that defines a pairwise relational loss function and unifies the value scale of ANN and SNN representation vectors, thereby alleviating unexpected precision loss. This allows the knowledge of teacher ANNs to be used effectively to train student SNNs. Experimental results on three image datasets demonstrate that, whether homogeneous or heterogeneous teacher ANNs are used, the proposed SuperSNN significantly improves the learning of student SNNs with only two time steps.
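To make the abstract's "indirect relation-based approach" concrete, below is a minimal sketch of what a pairwise relational distillation loss could look like. It is an assumption-laden illustration, not the paper's actual implementation: the function name `pairwise_relation_loss`, the use of L2 normalization to unify the ANN/SNN value scales, and the choice of an MSE penalty on batch-level Gram matrices are all hypothetical choices consistent with the abstract's description.

```python
import torch
import torch.nn.functional as F

def pairwise_relation_loss(student_feats: torch.Tensor,
                           teacher_feats: torch.Tensor) -> torch.Tensor:
    """Hypothetical relation-based KD loss.

    Instead of matching SNN and ANN features directly (which suffers from
    their precision mismatch), compare the *pairwise relations* between
    samples in a batch, computed separately inside each network's own
    representation space.
    """
    # Flatten per-sample features and L2-normalize, so both networks'
    # representation vectors live on a common value scale (the unit sphere).
    s = F.normalize(student_feats.flatten(1), dim=1)
    t = F.normalize(teacher_feats.flatten(1), dim=1)
    # Pairwise relation (Gram) matrices over the batch: shape (B, B),
    # entry (i, j) is the cosine similarity between samples i and j.
    rel_s = s @ s.t()
    rel_t = t @ t.t()
    # Penalize disagreement between the two relational structures.
    return F.mse_loss(rel_s, rel_t)

# Toy usage with stand-in features (128-dim, batch of 8):
if __name__ == "__main__":
    snn_feats = torch.randn(8, 128)  # e.g., SNN features accumulated over 2 time steps
    ann_feats = torch.randn(8, 128)  # e.g., features from a pre-trained teacher ANN
    print(pairwise_relation_loss(snn_feats, ann_feats).item())
```

Because the loss only compares each network's internal similarity structure, the teacher and student need not share an architecture or even a feature dimensionality, which is consistent with the abstract's claim that both homogeneous and heterogeneous teacher ANNs can be used.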