Differential Coding for Training-Free ANN-to-SNN Conversion

Published: 01 May 2025, Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
Abstract: Spiking Neural Networks (SNNs) exhibit significant potential due to their low energy consumption. Converting Artificial Neural Networks (ANNs) to SNNs is an efficient way to achieve high-performance SNNs. However, many conversion methods are based on rate coding, which requires numerous spikes and longer time-steps than directly trained SNNs, leading to increased energy consumption and latency. This article introduces differential coding for ANN-to-SNN conversion, a novel coding scheme that reduces spike counts and energy consumption by transmitting changes in rate information rather than rates directly, and explores its application across various layers. Additionally, the threshold iteration method is proposed to optimize thresholds based on the activation distribution when converting Rectified Linear Units (ReLUs) to spiking neurons. Experimental results on various Convolutional Neural Networks (CNNs) and Transformers demonstrate that the proposed differential coding significantly improves accuracy while reducing energy consumption, particularly when combined with the threshold iteration method, achieving state-of-the-art performance. The source code of the proposed method is available at https://github.com/h-z-h-cell/ANN-to-SNN-DCGS.
Lay Summary: We noticed that when converting traditional Artificial Neural Networks (ANNs) into energy-efficient Spiking Neural Networks (SNNs), most methods rely on rate coding—which requires many spikes and long time steps, leading to increased energy consumption and latency. To address this, we introduced differential coding, which transmits changes in information rather than direct rates to reduce spike counts and energy use. We also introduced a threshold iteration technique to better adapt ReLUs to spiking neurons based on their activation patterns. Our experiments demonstrated that combining these methods not only improves accuracy but also significantly reduces energy consumption, achieving state-of-the-art performance. This research paves the way for developing efficient, low-energy AI systems, showcasing potential for various applications and advancing more sustainable computing technologies.
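The core idea of transmitting changes rather than full rates can be illustrated with a minimal sketch. The code below is an assumption-laden toy model (function names, the signed-spike mechanism, and the single-threshold setup are illustrative, not the paper's exact algorithm): an encoder emits signed spikes only when the underlying rate estimate changes, and a decoder reconstructs the signal by integrating those spikes. For a rate signal that stays constant over several time-steps, this requires far fewer spikes than re-transmitting the rate at every step.

```python
import numpy as np

def differential_encode(rates, threshold=1.0):
    """Toy differential coder (illustrative, not the paper's exact method):
    emit signed spike counts for the *change* in the rate signal."""
    spikes = []
    membrane = 0.0   # accumulates the yet-untransmitted change
    last_sent = 0.0  # value the receiver currently holds
    for r in rates:
        membrane += r - last_sent
        n = int(np.trunc(membrane / threshold))  # signed spikes this step
        spikes.append(n)
        last_sent += n * threshold
        membrane -= n * threshold
    return spikes

def differential_decode(spikes, threshold=1.0):
    # Receiver reconstructs the rate by integrating the signed spikes.
    return np.cumsum(np.array(spikes, dtype=float) * threshold)
```

For a rate sequence `[3, 3, 3, 1]`, rate coding would spend on the order of 3+3+3+1 = 10 spikes, while the differential encoder above sends `[3, 0, 0, -2]`, i.e. 5 spike events, and the decoder still recovers `[3, 3, 3, 1]` exactly. This is the intuition behind the spike-count and energy reductions claimed above.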
Link To Code: https://github.com/h-z-h-cell/ANN-to-SNN-DCGS
Primary Area: Applications->Neuroscience, Cognitive Science
Keywords: Spiking Neural Networks, ANN-to-SNN Conversion, Differential Coding, Threshold Iteration
Submission Number: 11985