Abstract: Spiking neural networks (SNNs) have shown promising applications in the image recognition domain, and their key component is the spiking neuron. Spiking neurons mainly perform integration and firing processes, which are essentially weight accumulation and threshold comparison, respectively. However, the spike trains of neurons exhibit high sparsity and irregularity in both the temporal and spatial domains, leading to inefficient memory access and computation. Designing an efficient accelerator for SNNs is therefore urgent. This paper presents Early, an elaborate accelerator built through software-hardware co-design. At the software level: (i) Noticing the importance of weights, where larger weights disproportionately affect the membrane potential, we devise a weight importance-aware early firing solution for firing neurons. It prioritizes the accumulation of these large weights, thereby accelerating the rise of the membrane potential so that it surpasses the threshold sooner. (ii) Meanwhile, given the observation that a large proportion of neurons never fire even after a long period of weight accumulation, we propose a weight importance-aware early exit mechanism. It preferentially accumulates large weights and compares the membrane potential with a predetermined threshold, halting early the accumulation of neurons that are unlikely to fire and thus enhancing efficiency. At the hardware level, we design a specialized processing element (PE) featuring a reorder engine for spikes and weights, tailored to realize the aforementioned strategies. Experimental results show that Early achieves average speedups of 20.3×, 6.5×, and 2.4× over the state-of-the-art accelerators SpinalFlow, PTB, and SATO, respectively. Meanwhile, it achieves average energy savings of 25.2×, 7.4×, and 3.2× with respect to the same three accelerators.
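To make the two software-level strategies concrete, the following is a minimal, hypothetical Python sketch of one integrate-and-fire step for a single neuron. The function and parameter names (`integrate_early`, `fire_threshold`, `exit_threshold`, `check_frac`) are illustrative assumptions, not the paper's implementation, and the exit criterion shown (comparing the membrane potential against a predetermined threshold after a fixed fraction of accumulations) is only one plausible realization of the mechanism described above.

```python
import numpy as np

def integrate_early(weights, spikes, fire_threshold, exit_threshold, check_frac=0.5):
    """Sketch of weight importance-aware early firing and early exit.

    Active synaptic weights are accumulated in descending magnitude order,
    so large (important) weights contribute first. The neuron fires as soon
    as the membrane potential crosses fire_threshold (early firing). If,
    after a fraction check_frac of the accumulations, the potential is still
    below exit_threshold, the neuron is judged unlikely to fire and the
    remaining accumulation is skipped (early exit).
    """
    active = weights[spikes.astype(bool)]          # weights of spiking inputs
    active = active[np.argsort(-np.abs(active))]   # large-magnitude weights first
    check_point = int(np.ceil(check_frac * len(active)))

    v = 0.0
    for k, w in enumerate(active, start=1):
        v += w
        if v >= fire_threshold:       # early firing: threshold already crossed
            return True, k
        if k == check_point and v < exit_threshold:
            return False, k           # early exit: unlikely to ever fire
    return False, len(active)

# Example: one integration step over 64 synapses with random weights and spikes.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 1.0, 64)
s = rng.integers(0, 2, 64)
fired, ops = integrate_early(w, s, fire_threshold=4.0, exit_threshold=1.0)
print(fired, ops)  # whether the neuron fired and how many accumulations ran
```

In both outcomes the returned operation count is smaller than a full accumulation would require, which is the source of the speedup and energy savings reported in the abstract; the reorder engine in the PE would perform the magnitude-based reordering in hardware rather than with a software sort.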