UniSpike: Boosting the Performance of Spiking Neural Network with Hybrid Training

17 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · License: CC BY 4.0
Keywords: Spiking Neural Network, Hybrid Training
Abstract: Spiking neural networks (SNNs) are increasingly studied for their brain-inspired computing paradigm, which offers high efficiency and sparse activation. However, achieving high accuracy with a small number of time-steps remains challenging for SNNs. In this paper, we propose UniSpike, a hybrid training framework that combines the high accuracy of ANN-to-SNN conversion algorithms with the ultra-low time-step inference of direct training algorithms. UniSpike converts a quantized ANN into its SNN counterpart and then fine-tunes the converted SNN. To replace the SNN-unfriendly operators in the ANN, UniSpike introduces Unified-Clip, an operator that is equivalent to spike neurons and can replace SNN-unfriendly operators (i.e., softmax, layernorm, and GeLU) without degrading ANN accuracy. Building on Unified-Clip, UniSpike proposes UniFormer, a novel transformer that is addition-only and supports step-by-step inference. UniFormer allows all matrix multiplications except the patch embedding to be realized by simple additions and eliminates synchronization operations across time-steps. With UniFormer, UniSpike achieves 80.9% accuracy on ImageNet at 4 time-steps, outperforming the SOTA direct training algorithm Spike-Driven Transformer V2 (80.0%) while retaining the addition-only and step-by-step properties. Compared to the SOTA conversion-based algorithm SpikeZIP-TF, UniSpike reduces energy consumption by 5.7$\times$ with comparable accuracy.
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 9608