Distilling SNN Students from ANN Teachers via Spiking Neural Architecture Search

ICLR 2026 Conference Submission 18211 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Neural architecture search, spiking neural network, knowledge distillation
Abstract: Bridging the performance gap between Spiking Neural Networks (SNNs) and Artificial Neural Networks (ANNs) under low timesteps remains a critical challenge in the SNN community. Recent work narrows this gap using either ANN-supervised training or automated architecture design. However, the combination of ANN-supervised training and SNN architecture search remains unexplored, leaving room for further improvement of SNN performance. To address this, we propose DSAS, a training-free spiking neural architecture search method that leverages pre-trained ANN teachers to discover efficient, high-performance SNNs with few timesteps. Specifically, DSAS employs an evolutionary neural architecture search guided by two novel metrics, i.e., Multi-layer Activation Similarity (MAS) and Threshold-guided Gradient Similarity (TGS). MAS aligns ANN and SNN feature maps, while TGS ensures gradient alignment as spiking thresholds are tuned. Experiments demonstrate that DSAS achieves state-of-the-art accuracy with only four timesteps, effectively narrowing the performance gap between ANNs and SNNs. For example, DSAS discovers architectures that achieve 66.00\% top-1 accuracy on Tiny-ImageNet and 81.97\% on CIFAR-100. Code is available at: https://anonymous.4open.science/r/DSAS-5764
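To make the MAS idea concrete, below is a minimal sketch (not the authors' implementation) of a multi-layer activation-similarity score: the average cosine similarity between ANN and SNN feature maps collected layer by layer. The function name, tensor shapes, and the assumption that SNN activations are pre-averaged over timesteps are illustrative choices, not details taken from the paper.

```python
# Hedged sketch of an MAS-style metric: higher values indicate closer
# ANN-SNN feature alignment. Shapes and time-averaging are assumptions.
import torch
import torch.nn.functional as F

def mas_score(ann_feats, snn_feats):
    """ann_feats, snn_feats: lists of per-layer feature maps with matching shapes.
    SNN feature maps are assumed to be averaged over timesteps beforehand."""
    sims = []
    for a, s in zip(ann_feats, snn_feats):
        a = a.flatten(start_dim=1)   # (batch, features)
        s = s.flatten(start_dim=1)
        sims.append(F.cosine_similarity(a, s, dim=1).mean())
    return torch.stack(sims).mean()  # average over layers

# Toy usage: three layers of random activations for a batch of 8 samples.
ann = [torch.randn(8, 64, 16, 16), torch.randn(8, 128, 8, 8), torch.randn(8, 256, 4, 4)]
snn = [f + 0.1 * torch.randn_like(f) for f in ann]  # SNN features close to the ANN's
print(float(mas_score(ann, snn)))
```

In a training-free search of this kind, such a score could be computed on a small batch of data for each candidate architecture and used as a ranking signal without any weight training; how DSAS combines MAS with TGS is described in the paper itself.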
Primary Area: applications to neuroscience & cognitive science
Submission Number: 18211