Towards Zero Memory Footprint Spiking Neural Network Training

20 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: general machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Deep Neural Networks, Spiking Neural Networks, Reversible Layer
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Spiking Neural Networks (SNNs), as representative brain-inspired neural networks, emulate the intrinsic characteristics and functional principles of the biological brain. With their unique structure reflecting biological signal transmission through spikes, they have achieved significant success in processing temporal data. However, training SNNs demands a substantial memory footprint, since spikes or events must be stored in addition to standard activations, leading to intricate architectures and dynamic memory configurations. In this paper, to address memory constraints in SNN training, we introduce a framework with a remarkably low memory footprint. We \textbf{(i)} design a reversible SNN node that retains a high level of accuracy. Our design achieves a $\mathbf{58.65\times}$ reduction in memory usage compared to the current SNN node. We \textbf{(ii)} propose a unique algorithm to streamline the backpropagation process of our reversible SNN node. This significantly trims the backward floating-point operations (FLOPs), thereby accelerating training relative to the current reversible layer backpropagation method. Using our algorithm, training time is reduced by $\mathbf{23.8\%}$ relative to existing reversible layer architectures.
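The reversible-node idea can be illustrated with a RevNet-style additive coupling: because a reversible block's inputs can be reconstructed exactly from its outputs, intermediate activations need not be cached for backpropagation, which is where the memory saving comes from. The sketch below is a minimal illustration under that assumption, with a hypothetical surrogate-gradient spike function standing in for the paper's actual node dynamics; it is not the authors' design, and their streamlined backward pass (contribution ii) would further cut the recomputation FLOPs that this naive inversion incurs.

```python
import torch

# Hypothetical surrogate-gradient spike function: Heaviside step in the
# forward pass, sigmoid derivative as the surrogate in the backward pass
# (a common choice in SNN training; an assumption here).
class SurrogateSpike(torch.autograd.Function):
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        sig = torch.sigmoid(v)
        return grad_out * sig * (1 - sig)

def spike_fn(v):
    return SurrogateSpike.apply(v)

def f(x, w):
    # Sub-function of the coupling; stands in for the node's
    # membrane-update + spike computation (hypothetical).
    return spike_fn(x @ w)

def forward_block(x1, x2, w1, w2):
    # RevNet-style additive coupling: the outputs fully determine the
    # inputs, so no activations need to be stored for backprop.
    y1 = x1 + f(x2, w1)
    y2 = x2 + f(y1, w2)
    return y1, y2

def invert_block(y1, y2, w1, w2):
    # Reconstruct the inputs from the outputs during the backward pass,
    # trading recomputation for activation memory.
    x2 = y2 - f(y1, w2)
    x1 = y1 - f(x2, w1)
    return x1, x2

# Quick check that inversion is exact.
x1, x2 = torch.randn(4, 16), torch.randn(4, 16)
w1, w2 = torch.randn(16, 16), torch.randn(16, 16)
y1, y2 = forward_block(x1, x2, w1, w2)
r1, r2 = invert_block(y1, y2, w1, w2)
assert torch.allclose(r1, x1) and torch.allclose(r2, x2)
```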
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 2819