Abstract: Spiking Neural Networks (SNNs) aim to mimic the spiking behavior of biological neurons and are expected to play a key role in neural computing and artificial intelligence. Converting Artificial Neural Networks (ANNs) to SNNs is a widely used approach to achieve comparable performance on large-scale datasets, with efficiency determined by activation encoding. Current schemes, which typically rely on spike count or timing, exhibit a linear relationship between encoding precision and the number of required timesteps. To enhance encoding capacity with fewer timesteps, we propose the Canonic Signed Spike (CSS) coding scheme. Spikes are assigned different weights during the neuron's decoding stage while maintaining a single-bit spike representation. We analyze the residual errors introduced during encoding and propose the Over-Fire-and-Correct (OFC) method to enable efficient computation with weighted spikes. The optimal threshold derived from our method can also be applied to integrate-and-fire (IF) neurons to improve accuracy under rate coding. We evaluate the proposed methods on the CIFAR-10 and ImageNet datasets. The experimental results demonstrate that the CSS coding scheme significantly compresses timesteps with minimal conversion loss and offers an energy efficiency advantage for the resulting SNNs.
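To illustrate the capacity difference between count-based decoding and weighted-spike decoding described in the abstract, the sketch below contrasts the two. It is not the authors' implementation: the power-of-two weights, the sign convention, and all function names are illustrative assumptions; the only point carried over from the abstract is that spikes stay single-bit while their contribution at decoding time is weighted.

```python
import numpy as np

def rate_decode(spikes, threshold=1.0):
    """Rate coding (assumed form): the decoded value is proportional to the
    spike count, so T timesteps yield only T + 1 distinct levels."""
    return threshold * np.sum(spikes, axis=-1) / spikes.shape[-1]

def weighted_spike_decode(spikes, signs, threshold=1.0):
    """Weighted decoding (assumed form, canonic-signed-digit style): the spike
    at step t contributes sign_t * 2^-(t+1) * threshold, so T single-bit spikes
    can represent on the order of 2^T signed levels instead of T + 1."""
    T = spikes.shape[-1]
    weights = threshold * (2.0 ** -np.arange(1, T + 1))  # 1/2, 1/4, ..., 1/2^T
    return np.sum(spikes * signs * weights, axis=-1)

# Example with T = 4: spikes are single-bit, signs are in {+1, -1}.
spikes = np.array([1, 0, 1, 1], dtype=float)
signs = np.array([+1, +1, -1, +1], dtype=float)
print(rate_decode(spikes))                   # 0.75   (one of only 5 levels)
print(weighted_spike_decode(spikes, signs))  # 0.4375 (0.5 - 0.125 + 0.0625)
```

Under these assumptions, encoding precision grows exponentially rather than linearly with the number of timesteps, which is the motivation the abstract attributes to the CSS scheme.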
Primary Area: Applications->Neuroscience, Cognitive Science
Keywords: Spiking Neural Networks, spike encoding scheme, ANN-SNN conversion
Submission Number: 6944