TS-SNN: Temporal Shift Module for Spiking Neural Networks

Published: 01 May 2025, Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
Abstract: Spiking Neural Networks (SNNs) are increasingly recognized for their biological plausibility and energy efficiency, positioning them as strong alternatives to Artificial Neural Networks (ANNs) in neuromorphic computing applications. SNNs inherently process temporal information by leveraging the precise timing of spikes, but balancing temporal feature utilization with low energy consumption remains a challenge. In this work, we introduce the Temporal Shift module for Spiking Neural Networks (TS-SNN), which incorporates a novel Temporal Shift (TS) module to integrate past, present, and future spike features within a single timestep via a simple yet effective shift operation. A residual combination method prevents information loss by integrating shifted and original features. The TS module is lightweight, requiring only one additional learnable parameter, and can be seamlessly integrated into existing architectures with minimal additional computational cost. TS-SNN achieves state-of-the-art performance on benchmarks such as CIFAR-10 (96.72%), CIFAR-100 (80.28%), and ImageNet (70.61%) with fewer timesteps, while maintaining low energy consumption. This work marks a significant step forward in developing efficient and accurate SNN architectures.
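The abstract's description of the TS module suggests a shift-and-residual pattern familiar from temporal shift modules in video models. The sketch below is a minimal, hypothetical illustration of that idea, not the paper's implementation: it assumes spike features of shape (timesteps, channels), shifts one channel slice toward the past and another toward the future, and combines the result with the original features through a residual sum weighted by a single scalar `alpha` (standing in for the one learnable parameter; the names `fold_div` and `alpha` are assumptions).

```python
import numpy as np

def temporal_shift(x, fold_div=4, alpha=0.5):
    """Hypothetical sketch of a temporal shift over spike features.

    x: array of shape (T, C) -- timesteps x channels.
    One channel slice is filled from the next timestep (future),
    another from the previous timestep (past), and the remaining
    channels are left unchanged (present). A residual combination
    with a single scalar weight `alpha` (a learnable parameter in
    the paper, a constant here) preserves the original features.
    """
    T, C = x.shape
    fold = C // fold_div
    shifted = np.zeros_like(x)
    # channels [0, fold): pull features from the next timestep (future)
    shifted[:-1, :fold] = x[1:, :fold]
    # channels [fold, 2*fold): pull features from the previous timestep (past)
    shifted[1:, fold:2 * fold] = x[:-1, fold:2 * fold]
    # remaining channels: keep the present timestep
    shifted[:, 2 * fold:] = x[:, 2 * fold:]
    # residual combination prevents information loss from the shift
    return x + alpha * shifted

# Toy usage: 3 timesteps, 4 channels
features = np.arange(12, dtype=float).reshape(3, 4)
out = temporal_shift(features)
```

Each output timestep thus mixes information from its temporal neighbours at the cost of one extra parameter per module, which is consistent with the lightweight design the abstract claims.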
Lay Summary: Spiking Neural Networks (SNNs) are a type of brain-inspired computing system that mimics how neurons communicate through electrical pulses. They are energy-efficient and well-suited for tasks involving sensor data, where timing matters. However, existing SNNs often struggle to balance accuracy and energy use when handling time-sensitive information. In this work, we developed a simple yet powerful Temporal Shift module to improve SNN performance, yielding the TS-SNN architecture. The module acts like a "time window," combining past, present, and future information within a single processing step. By adding just one adjustable parameter and using minimal extra computation, TS-SNN retains critical details while keeping energy costs low. Tests on image recognition benchmarks showed TS-SNN achieves top-tier accuracy, such as 96.7% on CIFAR-10 and 70.6% on ImageNet, while using fewer processing steps than traditional methods. This brings us closer to AI systems that are both highly accurate and energy-efficient, paving the way for smarter devices in applications like robotics or wearable tech. We hope this inspires further innovations in brain-like computing systems.
Primary Area: Applications->Neuroscience, Cognitive Science
Keywords: Spiking Neural Networks, Neuromorphic Computing, Brain-inspired Learning
Submission Number: 3594