Deep Spiking Neural Network with Brain-Inspired Recurrent Iterative Learning

ICLR 2026 Conference Submission 17580 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Spiking neural networks
Abstract: Spiking neural networks (SNNs) have emerged as a transformative paradigm in artificial intelligence, offering event-driven computation and exceptional energy efficiency. However, conventional SNN training methods predominantly rely on backpropagation with surrogate gradients, often neglecting biologically plausible mechanisms such as spike-timing-dependent computations and dynamic excitation-inhibition balance—key features that underpin the brain’s remarkable efficiency and adaptability. To bridge this gap, we propose Brain-Inspired Recurrent Iterative Learning (BIRIL), a novel hybrid learning framework that synergistically integrates biologically realistic spike transmission with adaptive excitation-inhibition dynamics. BIRIL not only emulates the temporal precision of biological neurons but also dynamically modulates neuronal activity to enhance learning efficiency. Extensive experiments on benchmark datasets—including CIFAR-10, CIFAR-100, MNIST, and DVS128 Gesture—demonstrate that BIRIL outperforms state-of-the-art SNN models, achieving superior accuracy while maintaining low computational overhead. Our work provides a principled approach to advancing neuromorphic learning, paving the way for more brain-like and energy-efficient AI systems.
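The abstract contrasts BIRIL with conventional training via backpropagation with surrogate gradients. As background only, and not a description of the authors' method, the standard surrogate-gradient idea for a leaky integrate-and-fire (LIF) neuron can be sketched as follows: the forward pass uses a non-differentiable Heaviside spike, while the backward pass substitutes a smooth surrogate derivative. The time constant, threshold, and fast-sigmoid surrogate below are illustrative assumptions.

```python
import numpy as np

def lif_step(v, x, tau=2.0, v_th=1.0):
    """One Euler step of a leaky integrate-and-fire neuron.

    v: membrane potential, x: input current (arrays of equal shape).
    Returns (spike, new potential after hard reset).
    """
    v = v + (x - v) / tau                 # leaky integration toward the input
    spike = (v >= v_th).astype(v.dtype)   # non-differentiable Heaviside firing
    v = v * (1.0 - spike)                 # hard reset where a spike occurred
    return spike, v

def surrogate_grad(v, v_th=1.0, alpha=2.0):
    """Fast-sigmoid surrogate used in place of dSpike/dV during backprop.

    The true derivative of the Heaviside is zero almost everywhere, so
    training substitutes this smooth, peaked approximation (an assumption;
    papers differ on the exact surrogate shape).
    """
    return alpha / (2.0 * (1.0 + alpha * np.abs(v - v_th)) ** 2)
```

A strong input current fires the neuron and resets it in one step, while the surrogate peaks at the threshold, letting gradient signals pass through spiking units that would otherwise block them.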
Primary Area: applications to neuroscience & cognitive science
Submission Number: 17580