Enhancing the Representational Power of Spiking Neural Networks via Diversified Spike Patterns

ICLR 2026 Conference Submission16808 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: spiking neural networks, deep learning
TL;DR: We analyze the shortcomings of direct coding and propose a novel neural coding scheme that improves the performance of SNNs by diversifying the encoded spike trains.
Abstract: One of the fundamental aspects of spiking neural networks (SNNs) is how they encode and process information through the generation of spikes, and direct coding is one of the most widely used coding schemes owing to its simplicity and promising performance. In this study, we examine the traits of the encoded spike trains under the direct coding scheme and reveal that a severe imbalance in the distribution of spike train patterns can pose a major obstacle to SNN performance. Based on our analyses, we propose diverse-pattern coding (DPC), a novel neural coding scheme that diversifies encoder output spike patterns through two technical components: temporal embedding and a temporal feedback layer. The former incorporates information over time into the input, and the latter applies a recurrent layer at each timestep to deliver heterogeneous features to the input spiking neurons. Our extensive experimental results demonstrate that DPC improves SNN performance through diversified encoded spikes, achieving superior performance across multiple datasets and model architectures with minimal increase in memory cost.
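To make the pattern-imbalance issue concrete, the following is a minimal sketch of direct coding, assuming a simple LIF encoder with threshold 1.0 and hard reset (these details are illustrative assumptions, not the paper's exact neuron model). Because the same analog input drives the encoder at every timestep, each pixel can only produce a near-periodic spike train whose period is set by its intensity, so only a small fraction of the 2^T possible T-step patterns is ever emitted:

```python
import numpy as np

def direct_code(x, timesteps=4, threshold=1.0):
    """Direct coding sketch: the same analog input x is injected as
    current into a LIF neuron at every timestep; returns the binary
    spike trains, shape (timesteps, ...)."""
    v = np.zeros_like(x, dtype=float)  # membrane potential
    spikes = []
    for _ in range(timesteps):
        v = v + x                           # identical input each step
        s = (v >= threshold).astype(int)    # fire when threshold reached
        v = np.where(s == 1, 0.0, v)        # hard reset after a spike
        spikes.append(s)
    return np.stack(spikes, axis=0)

x = np.array([0.3, 0.6, 1.2])   # three example pixel intensities
trains = direct_code(x)
# columns are near-periodic: [0,0,0,1], [0,1,0,1], [1,1,1,1]
```

With T = 4 the reachable patterns are essentially the few rate-determined periodic trains above, out of 16 possible binary patterns, which illustrates the distributional imbalance the paper identifies; DPC's temporal embedding and temporal feedback layer are designed to break this repetition by giving the encoder a different effective input at each timestep.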
Primary Area: applications to neuroscience & cognitive science
Submission Number: 16808