SNIB: Improving Spike-Based Machine Learning Using Nonlinear Information Bottleneck

Published: 01 Jan 2023, Last Modified: 01 Oct 2024. IEEE Trans. Syst. Man Cybern. Syst. 2023. License: CC BY-SA 4.0.
Abstract: Spiking neural networks (SNNs) have attracted growing attention in artificial general intelligence (AGI) research because of the low power consumption, high computational efficiency, and low latency that stem from their event-driven, sparse communication. However, training an SNN efficiently and robustly remains challenging. In this study, we introduce a novel framework for spike-based machine learning called the spike-based nonlinear information bottleneck (SNIB). The framework combines an information-theoretic learning (ITL) approach with a surrogate gradient learning (SGL) method to achieve robust, accurate, and low-power performance. SNIB comprises three variants: 1) the squared information bottleneck (SIB); 2) the cubic information bottleneck (CIB); and 3) the quartic information bottleneck (QIB), each of which uses a mapping mechanism to compress spiking representations. We systematically evaluate these strategies under different types of input noise and neuromorphic hardware noise. Our experimental results demonstrate that all three strategies effectively enhance the robustness of SGL across SNN architectures, and that SNIB significantly reduces the power consumption of SNNs. SNIB thus offers a new perspective for hardware-constrained mobile devices and embedded edge intelligence, and represents a step toward realizing AGI.
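The abstract describes the training recipe only at a high level: a surrogate-gradient SNN whose loss adds a nonlinear information-bottleneck penalty, with the squared, cubic, and quartic variants differing in the power applied to the compression term. The PyTorch sketch below illustrates that general shape under stated assumptions; the LIF dynamics, the fast-sigmoid surrogate, the mean-firing-rate compression proxy, and every name in it (`SurrogateSpike`, `nonlinear_ib_penalty`, `beta`, `power`) are illustrative stand-ins, not the paper's actual objective or mapping mechanism.

```python
# Hedged sketch: surrogate-gradient SNN training with a nonlinear
# information-bottleneck penalty. The firing-rate proxy below is a
# placeholder for SNIB's compression term, which the abstract does
# not define; nothing here is the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; fast-sigmoid surrogate
    gradient in the backward pass (the SGL part)."""
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        return grad_out / (1.0 + v.abs()) ** 2

spike_fn = SurrogateSpike.apply

class LIFLayer(nn.Module):
    """Leaky integrate-and-fire layer unrolled over T time steps."""
    def __init__(self, in_dim, out_dim, decay=0.9, threshold=1.0):
        super().__init__()
        self.fc = nn.Linear(in_dim, out_dim)
        self.decay = decay
        self.threshold = threshold

    def forward(self, x):  # x: (T, B, in_dim)
        v = torch.zeros(x.size(1), self.fc.out_features, device=x.device)
        spikes = []
        for t in range(x.size(0)):
            v = self.decay * v + self.fc(x[t])    # leaky integration
            s = spike_fn(v - self.threshold)      # fire via surrogate
            v = v - s * self.threshold            # soft reset
            spikes.append(s)
        return torch.stack(spikes)  # (T, B, out_dim)

def nonlinear_ib_penalty(spikes, power):
    """Toy compression term raised to a power: power=2 mimics the SIB
    variant, 3 the CIB, 4 the QIB. Mean firing rate is a placeholder
    proxy for the paper's information-theoretic compression term."""
    return spikes.mean() ** power

# Toy training step: task loss plus the weighted nonlinear IB penalty.
T, B, in_dim, hid, n_cls = 20, 8, 100, 64, 10
hidden, readout = LIFLayer(in_dim, hid), nn.Linear(hid, n_cls)
x = torch.rand(T, B, in_dim)           # toy input currents
y = torch.randint(0, n_cls, (B,))      # toy labels
spk = hidden(x)                        # (T, B, hid) spike trains
logits = readout(spk.mean(dim=0))      # rate decoding
beta, power = 1e-2, 2                  # power=2 -> SIB-style penalty
loss = F.cross_entropy(logits, y) + beta * nonlinear_ib_penalty(spk, power)
loss.backward()
```

Setting `power` to 3 or 4 would play the role of the CIB and QIB variants in the same way; in the paper the compressed quantity is an information-theoretic term on the spiking representation rather than this toy firing-rate proxy.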