Keywords: Spiking Neural Networks; Mixed-Timestep Spiking Neural Networks; ANN-SNN Conversion; Temporal Alignment
TL;DR: This paper proposes Mixed-Timestep Spiking Neural Networks (MT-SNNs), develops a quantization-aware conversion framework, jointly optimizes quantization levels and firing thresholds, resolves the temporal dimension mismatch, and validates the approach experimentally (73.63% ImageNet-1K top-1 accuracy with 4.88 time-steps).
Abstract: Spiking Neural Networks (SNNs) are intrinsically energy-efficient. However, most existing models enforce a uniform time-step across all layers, which limits flexibility and degrades performance under low-latency inference.
To address this limitation, we propose Mixed-Timestep Spiking Neural Networks (MT-SNNs), a paradigm in which each layer operates with an optimally selected time-step, thereby enabling the joint optimization of accuracy and latency.
Within MT-SNNs, we develop a quantization-aware conversion framework that maps a pre-trained ANN to a mixed-timestep SNN.
Specifically, we first establish an equivalence principle between activation bit-width and time-steps: the SNN time-step $T$ can theoretically be approximated by the number of activation quantization levels $2^n$ in the source ANN. Based on this principle, the per-layer activation bit-widths of the ANN can be mapped layer-wise to corresponding time-steps in the SNN.
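To make the mapping concrete, here is a minimal illustrative sketch (not the authors' code; the example bit-widths are hypothetical) of how per-layer bit-widths translate into per-layer time-steps under $T = 2^n$:

```python
# Illustrative sketch: under the equivalence principle T = 2**n, each
# layer's activation bit-width n_l in the quantized ANN determines that
# layer's time-step in the converted SNN.
bit_widths = [3, 2, 3, 1, 2]             # hypothetical per-layer bit-widths n_l
time_steps = [2 ** n for n in bit_widths]

print(time_steps)                        # [8, 4, 8, 2, 4]
print(sum(time_steps) / len(time_steps)) # mean latency: 5.2 time-steps
```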
Then, we jointly optimize the quantization levels and firing thresholds to obtain a parameter combination that minimizes the overall $T$, yielding an optimized accuracy-latency trade-off.
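As one hypothetical way to picture this joint search (the paper's actual optimization procedure may differ), an exhaustive sketch could minimize the mean time-step subject to an accuracy target, with `evaluate` standing in for a black-box accuracy estimate of the SNN converted under the given bit-widths and calibrated firing thresholds:

```python
from itertools import product

def search_bit_widths(num_layers, evaluate, candidates=(1, 2, 3, 4),
                      target_acc=0.73):
    """Exhaustively search per-layer bit-widths n_l, minimizing the mean
    time-step 2**n_l while keeping estimated accuracy above a target.
    `evaluate(combo)` is an assumed black-box returning the validation
    accuracy of the SNN converted with these bit-widths."""
    best, best_latency = None, float("inf")
    for combo in product(candidates, repeat=num_layers):
        latency = sum(2 ** n for n in combo) / num_layers  # mean time-steps
        if latency >= best_latency:
            continue  # cannot improve the current accuracy-latency trade-off
        if evaluate(combo) >= target_acc:
            best, best_latency = combo, latency
    return best, best_latency
```

Exhaustive enumeration is only tractable for a handful of layers; a practical implementation would rely on a greedy or gradient-based selection instead.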
Finally, we identify a temporal dimension mismatch in MT-SNNs and propose a temporal alignment scheme to resolve it, ensuring proper propagation of activations across layers with different time-steps.
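The sketch below illustrates the mismatch and one simple alignment strategy (not necessarily the paper's scheme): because every layer's time-step is a power of two ($T = 2^n$), adjacent layers' time-steps are integer multiples of one another, so spikes can be repeated when moving to a longer time window and averaged when moving to a shorter one, preserving the mean firing rate:

```python
import torch

def align_time(x, t_out):
    """Align a spike tensor of shape [T_in, batch, features] to T_out steps.
    Repeating preserves binary spikes; averaging yields rates rather than
    binary spikes, which is why this is only an illustrative sketch."""
    t_in = x.shape[0]
    if t_in == t_out:
        return x
    if t_out % t_in == 0:   # expand: repeat each step t_out // t_in times
        return x.repeat_interleave(t_out // t_in, dim=0)
    if t_in % t_out == 0:   # shrink: average groups of t_in // t_out steps
        return x.reshape(t_out, t_in // t_out, *x.shape[1:]).mean(dim=1)
    raise ValueError("time-steps must be integer multiples of each other")

spikes = (torch.rand(8, 2, 4) < 0.3).float()  # layer output with T_in = 8
aligned = align_time(spikes, 4)               # re-timed for a T_out = 4 layer
```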
Extensive experiments on CIFAR-10, ImageNet-1K, CIFAR10-DVS, and DVS-Gesture demonstrate the effectiveness of our approach. On ImageNet-1K, our MT-SNNs achieve 73.63\% top-1 accuracy with only 4.88 time-steps, advancing the state of the art.
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 19238