Keywords: spiking neural networks, knowledge distillation, convolutional neural networks (CNNs), image classification
Abstract: Spiking Neural Networks (SNNs) are brain-inspired, event-driven networks that are becoming a promising energy-efficient alternative to Artificial Neural Networks (ANNs).
However, the performance of directly trained SNNs remains far from satisfactory.
Inspired by the idea of Teacher–Student Learning, in this paper we propose a novel learning method named $\textit{SuperSNN}$, which utilizes an ANN model to guide the learning of an SNN model.
$\textit{SuperSNN}$ leverages knowledge distillation to learn comprehensive supervisory information from pre-trained ANN models, rather than solely from labeled data.
Unlike previous work that naively matches SNN and ANN features without considering the precision mismatch between the two, we propose an indirect, relation-based approach that defines a pairwise relational loss function and unifies the value scales of ANN and SNN representation vectors, alleviating unexpected precision loss.
This allows the knowledge of teacher ANNs to be effectively utilized for training student SNNs.
Experimental results on three image datasets demonstrate that, whether homogeneous or heterogeneous teacher ANNs are used, our proposed $\textit{SuperSNN}$ significantly improves the learning of student SNNs with only two time steps.
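The abstract describes the method only at a high level. As an illustration, below is a minimal PyTorch sketch of what a pairwise relation-based distillation loss of this kind could look like. This is not the paper's implementation: the function name `pairwise_relation_loss`, the choice of L2 normalization to unify value scales, and the cosine-similarity relation matrices are all assumptions made for the example.

```python
import torch
import torch.nn.functional as F


def pairwise_relation_loss(ann_feats: torch.Tensor, snn_feats: torch.Tensor) -> torch.Tensor:
    """Hypothetical sketch of a pairwise relation-based distillation loss.

    ann_feats: (B, D_a) teacher ANN representation vectors for a batch.
    snn_feats: (B, D_s) student SNN representation vectors for the same batch.
    Feature dimensions may differ; only the (B, B) relation matrices are matched.
    """
    # Unify value scales: L2-normalize each representation vector so that
    # pairwise similarities lie in [-1, 1] for both networks, regardless of
    # the magnitude gap between full-precision ANN and spiking SNN features.
    ann = F.normalize(ann_feats, p=2, dim=1)
    snn = F.normalize(snn_feats, p=2, dim=1)

    # Pairwise (sample-to-sample) relation matrices, shape (B, B).
    ann_rel = ann @ ann.t()
    snn_rel = snn @ snn.t()

    # Match relations between samples rather than the raw features themselves.
    return F.mse_loss(snn_rel, ann_rel)
```

In a typical distillation setup, such a term would be added to the student's task loss, e.g. `loss = ce_loss + lambda_kd * pairwise_relation_loss(teacher_feats.detach(), student_feats)`, with the teacher features detached so that no gradients flow into the pre-trained ANN.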
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 2279