Spatiotemporal Spiking Entropic Bottleneck: Data-efficient Learning with Joint Redundancy Reduction in Spiking Neural Networks
Keywords: Spiking Neural Networks, Brain-inspired Computing, Neuromorphic Intelligence
Abstract: Spiking neural networks (SNNs) are energy-efficient, brain-inspired models that have received increasing attention in recent years. However, existing SNNs tend to overlook more challenging scenarios with insufficient sample sizes. In data-scarce settings, the spatiotemporal dynamics of SNNs often carry joint spatiotemporal redundancy, which compromises generalization and robustness. The information bottleneck principle has demonstrated powerful spatial compression in artificial neural networks, but its direct application to SNNs is nontrivial: the discrete, timing-dependent nature of spikes makes spatiotemporal entropy estimation inherently challenging. To reduce this joint redundancy for data-efficient learning, we propose the spatiotemporal spiking entropic bottleneck (STSEB), a framework that jointly compresses spatial and temporal information while preserving task-relevant features. Central to STSEB is the spike time matrix, which records each neuron’s first spiking time to extract the most critical temporal feature, discard redundant spikes, and align activities across neurons. We further develop a spike-time-matrix-based Rényi’s α-entropy estimator that captures the intrinsic frequency distribution of spatiotemporal spiking patterns to drive compression under the spatiotemporal bottleneck objective. We prove that STSEB yields more compact latent representations than the traditional information bottleneck, as measured by average spiking rate and total correlation. Experimental results show that STSEB achieves superior generalization and robustness over state-of-the-art methods under scarce samples, with higher sample efficiency and reduced power consumption. The code will be released upon acceptance.
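The two ingredients the abstract names, a spike time matrix of first spiking times and a Rényi α-entropy estimate over the resulting representations, can be sketched as follows. This is a minimal illustration only: the function names and the choice of a Gaussian-kernel, eigenvalue-based (matrix-based) Rényi entropy estimator are assumptions on our part, since the paper's actual STSEB implementation is not described here.

```python
# Illustrative sketch of (1) first-spike-time extraction and (2) a matrix-based
# Rényi alpha-entropy estimate over spike-time vectors. All names are
# hypothetical; this is not the authors' released code.
import numpy as np

def spike_time_matrix(spikes, t_max=None):
    """Record each neuron's first spiking time from a binary spike train.

    spikes: (T, N) binary array with T time steps and N neurons.
    Neurons that never fire are assigned t_max (default T), so every
    neuron contributes one aligned scalar to the output vector.
    """
    T, N = spikes.shape
    if t_max is None:
        t_max = T
    fired = spikes.any(axis=0)
    first = np.argmax(spikes, axis=0).astype(float)  # index of first 1 per column
    first[~fired] = t_max                            # silent neurons -> t_max
    return first

def matrix_renyi_entropy(X, alpha=2.0, sigma=1.0):
    """Matrix-based Rényi alpha-entropy: eigenvalues of a trace-normalized
    Gaussian Gram matrix over the rows (samples) of X.

    X: (n_samples, n_features) array, e.g. a batch of spike-time vectors.
    """
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T   # pairwise squared distances
    K = np.exp(-d2 / (2.0 * sigma**2))               # Gaussian kernel Gram matrix
    A = K / np.trace(K)                              # normalize so eigenvalues sum to 1
    lam = np.clip(np.linalg.eigvalsh(A), 1e-12, None)
    return np.log2(np.sum(lam**alpha)) / (1.0 - alpha)
```

A compression term of this form could, in principle, be added to a task loss to penalize redundant spatiotemporal structure; how STSEB actually combines the two objectives is specified in the paper, not here.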
Supplementary Material: zip
Primary Area: applications to neuroscience & cognitive science
Submission Number: 7425