A 22nm 0.43pJ/SOP Sparsity-Aware In-Memory Neuromorphic Computing System with Hybrid Spiking and Artificial Neural Network and Configurable Topology

Published: 01 Jan 2023, Last Modified: 13 May 2025, CICC 2023, CC BY-SA 4.0
Abstract: Spiking neural networks (SNNs) dynamically process complex spatio-temporal information as asynchronous and highly sparse spikes with high energy efficiency (EE). However, the training algorithms for non-differentiable and discrete SNNs are still immature, leading to relatively low accuracy [1]. For instance, abnormal ECG detection is realized by an SNN in [2] with 0.53pJ/SOP EE, but the accuracy is only 90.5%. In [3], the on-chip learning of a recurrent SNN for 1-word keyword spotting (KWS) achieved only 90.7% accuracy. In contrast, artificial neural networks (ANNs) can reach excellent accuracy through gradient-based backpropagation (BP) training but require substantial energy consumption due to their intensive computations and memory accesses. A unified ANN-SNN architecture was proposed in [4] for high accuracy, but it sacrifices EE due to massive data movement and a lack of sparsity utilization on the SNN side.
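To illustrate why spike sparsity translates into energy efficiency, the sketch below compares the event-driven synaptic operations (SOPs) of a toy leaky integrate-and-fire layer against the dense MACs an equivalent ANN layer would perform. All layer sizes, the spike probability, and the leak/threshold values are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration (not taken from the paper).
n_in, n_out, timesteps = 64, 32, 8
w = rng.normal(0.0, 0.5, size=(n_in, n_out))

spike_prob = 0.1             # sparse input: ~10% of inputs fire per step
v = np.zeros(n_out)          # membrane potentials
v_th, leak = 1.0, 0.9        # illustrative threshold and leak factor

sop_count = 0                # synaptic ops actually performed (event-driven)
mac_count = 0                # MACs a dense ANN layer would need instead

for _ in range(timesteps):
    spikes_in = rng.random(n_in) < spike_prob
    active = np.flatnonzero(spikes_in)
    # Event-driven accumulation: only rows of active presynaptic
    # neurons are touched, so work scales with spike count.
    v = leak * v + w[active].sum(axis=0)
    sop_count += active.size * n_out
    mac_count += n_in * n_out          # dense layer touches every weight
    fired = v >= v_th
    v = np.where(fired, 0.0, v)        # reset neurons that spiked

print(f"SOPs: {sop_count}, dense MACs: {mac_count}, "
      f"ratio: {sop_count / mac_count:.2f}")
```

With ~10% input sparsity the event-driven SOP count is roughly an order of magnitude below the dense MAC count, which is the effect a sparsity-aware SNN datapath exploits and a naive unified ANN-SNN pipeline forfeits.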