Abstract: Binary Spiking Neural Networks (BSNNs) offer promising efficiency advantages for resource-constrained computing. However, their training algorithms often incur substantial memory overhead due to latent weight storage and temporal processing requirements. To address this issue, we propose Binary Spiking Online (BSO) optimization, a novel online training algorithm that significantly reduces training memory. BSO updates weights directly through flip signals under the online training framework. These signals are triggered when the product of the gradient momentum and the weight exceeds a threshold, eliminating the need for latent weights during training. To enhance performance, we propose T-BSO, a temporal-aware variant that leverages the inherent temporal dynamics of BSNNs by capturing gradient information across time steps for adaptive threshold adjustment. Theoretical analysis establishes convergence guarantees for both BSO and T-BSO, with formal regret bounds characterizing their convergence rates. Extensive experiments demonstrate that both BSO and T-BSO achieve superior optimization performance compared to existing training methods for BSNNs. The code is available at \url{https://github.com/hamingsi/BSO}.
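The abstract's flip rule can be illustrated with a minimal sketch. The momentum coefficient `beta`, threshold `tau`, and the momentum reset after a flip are illustrative assumptions, not the authors' exact algorithm; the only element taken from the abstract is that a flip fires when the product of the gradient momentum and the weight exceeds a threshold, so no latent real-valued weights are kept.

```python
# Hedged sketch of a BSO-style update on binary weights in {-1, +1}.
import numpy as np

def bso_step(w, grad, m, beta=0.9, tau=1e-3):
    # Accumulate gradient momentum (exponential moving average).
    m = beta * m + (1.0 - beta) * grad
    # Flip signal: the momentum pushes against the weight's current
    # sign strongly enough to cross the threshold.
    flip = (m * w) > tau
    w = np.where(flip, -w, w)   # flip triggered weights; others stay binary
    m = np.where(flip, 0.0, m)  # assumed: reset momentum after a flip
    return w, m

# Toy usage with random gradients standing in for real backprop signals.
rng = np.random.default_rng(0)
w = rng.choice([-1.0, 1.0], size=8)  # binary weights, no latent copy stored
m = np.zeros_like(w)
for step in range(100):
    grad = rng.normal(size=8)
    w, m = bso_step(w, grad, m)
```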
Lay Summary: How can we create truly efficient brain-like computers that use simple binary switches and learn from timing patterns?
Scientists have been developing neural networks that combine two powerful ideas: processing information through brief spikes (like real neurons firing) and using only binary connections that are either fully "on" (+1) or completely "off" (-1). This combination promises revolutionary efficiency—imagine a computer that thinks in rapid spikes using only simple switches. But there's been a fundamental contradiction: training these spike-and-switch networks required secretly maintaining precise decimal backup copies of every connection.
Our breakthrough makes both the spikes and switches genuinely efficient. Our Binary Spiking Online (BSO) method eliminates all hidden decimal weights, treating connections as true binary switches that flip based on accumulated feedback momentum. Our advanced T-BSO version adds a crucial insight: it tracks how learning signals change over time, automatically adjusting its sensitivity when spike patterns are weak versus when they are chaotic. This temporal awareness, the recognition that brain-like networks naturally have different learning rhythms, is what lets T-BSO adapt its flip threshold across time steps (see the sketch below) and learn more reliably.
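A hedged sketch of the T-BSO idea: gather gradient statistics across the network's time steps and use them to scale the flip threshold. The accumulation rule (root-mean-square of per-step gradients) and the names `tau0` and `per_step_grads` are illustrative assumptions, not the paper's formula.

```python
import numpy as np

def tbso_threshold(per_step_grads, tau0=1e-3):
    # per_step_grads: shape (T, n), one gradient per SNN time step.
    # Temporal gradient scale: large where per-step gradients are noisy.
    v = np.sqrt((per_step_grads ** 2).mean(axis=0))
    # Assumed scaling: raise the flip threshold where temporal gradient
    # energy is high (chaotic spiking -> fewer flips) and keep it low
    # where gradients are weak, so consistent signals can still flip.
    return tau0 * (1.0 + v)
```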
Link To Code: https://github.com/hamingsi/BSO
Primary Area: Applications->Neuroscience, Cognitive Science
Keywords: Spiking Neural Network, Online Learning, Spiking Binarization
Submission Number: 6324