From Noise to Semantics: Bit Operations Decouple Spiking Neural Networks for Entropy Optimization

16 Sept 2025 (modified: 23 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: spiking neural network; neuromorphic recognition; semantic decoupling
TL;DR: Efficiently decoupling spike semantics and noise for optimal representation.
Abstract: Although the binary spike transmission mechanism enables spiking neural networks (SNNs) to consume ultra-low power, SNNs have long struggled with suboptimal performance. This paper points out that SNN spike maps suffer from significant spike noise, which impairs object semantics and limits performance. To mitigate spike noise interference, we explore minimal bit operations for spike map decoupling. Specifically, we show that the AND operation can ingeniously extract stable object semantics across timesteps, while the XOR operation separates out unstable spike noise. By approximating the original spike map through this decoupling, we propose an information bottleneck-based entropy optimization strategy that explicitly minimizes the conditional entropy of the object semantics while maximizing that of the spike noise. This dual entropy optimization allows SNNs to ignore noise interference and learn the optimal semantic representation. To ensure efficiency, we replace the entire forward-backward propagation with a lightweight classifier that estimates conditional entropy, introducing minimal training overhead. Extensive experiments show that our method significantly improves SNN performance and offers superior generalization ability. In particular, with a single training run, our method can be seamlessly combined with others for flexible-timestep inference and ultra-low-latency early exit. This provides new insights into the efficient decoupling and optimization of SNNs.
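The abstract's decoupling idea can be sketched in a few lines. This is a hypothetical illustration, not the paper's implementation: spike maps are modeled as flat binary lists over T timesteps, the AND is taken across all timesteps to keep consistently firing positions (stable semantics), and the XOR is taken between consecutive timesteps to keep flipping positions (unstable noise). The function name `decouple` and the consecutive-timestep pairing for XOR are assumptions for illustration only.

```python
# Hypothetical sketch of AND/XOR spike-map decoupling across timesteps.
# Assumes spike maps are flattened binary lists; the consecutive-pair XOR
# is one plausible reading of the abstract, not the paper's exact scheme.

def decouple(spike_maps):
    """spike_maps: list of T equal-length binary lists (flattened maps)."""
    T = len(spike_maps)
    n = len(spike_maps[0])
    # AND across all timesteps -> positions that fire consistently
    # (stable object semantics in the paper's terminology)
    semantics = [int(all(spike_maps[t][i] for t in range(T))) for i in range(n)]
    # XOR between consecutive timesteps -> positions that flip
    # (unstable spike noise in the paper's terminology)
    noise = [
        [spike_maps[t][i] ^ spike_maps[t + 1][i] for i in range(n)]
        for t in range(T - 1)
    ]
    return semantics, noise

maps = [
    [1, 0, 1, 1],  # t = 0
    [1, 1, 1, 0],  # t = 1
    [1, 0, 1, 1],  # t = 2
]
sem, noise = decouple(maps)
print(sem)    # -> [1, 0, 1, 0]: only positions 0 and 2 spike at every timestep
print(noise)  # -> [[0, 1, 0, 1], [0, 1, 0, 1]]: per-step flips
```

On real spike tensors the same idea would reduce to elementwise bitwise AND/XOR, which is why the paper can describe the decoupling as minimal bit operations.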
Primary Area: learning theory
Submission Number: 6440