Abstract: Spiking Neural Networks (SNNs), renowned for their bio-inspired operational mechanism and energy efficiency, mirror the neural activity of the human brain. Yet, SNNs struggle to balance energy efficiency against the computational demands of advanced tasks. Our research introduces RTFormer, a novel architecture that embeds Re-parameterized Temporal Sliding Batch Normalization (TSBN) within the Spiking Transformer framework. This innovation reduces energy usage during inference while preserving robust computational performance. The crux of RTFormer lies in its integration of re-parameterized convolutions and TSBN, achieving an equilibrium between computational capability and energy conservation. Our experimental results highlight its effectiveness: RTFormer achieves notable accuracy on standard datasets such as ImageNet (80.54%), CIFAR-10 (96.27%), and CIFAR-100 (81.37%), and excels on neuromorphic datasets such as CIFAR10-DVS (83.6%) and DVS128 (98.61%). These results illustrate RTFormer's versatility and establish its potential in the realm of energy-efficient neural computing.