Proper Backward Connection Placement Boosts Spiking Neural Networks

22 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Supplementary Material: zip
Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Spiking Neural Networks, Temporal Backward Connections, Neural Architecture Search
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: Our paper shows that adding well-placed, bespoke backward connections to a spiking neural network yields significant performance gains while incurring minimal additional inference cost.
Abstract: We study how backward connections (BCs, also known as temporal feedback connections) affect the performance of Spiking Neural Networks (SNNs) and how they can be used to boost it. BCs have the potential to enhance SNNs' representation capacity by creating new temporal pathways within the network: a back (later) neuron guides the expression of a front (earlier) neuron through the feedback it sends. We find that BCs placed at almost all locations enhance the SNN's representation ability. Based on this analysis, we propose a backward-connected SNN (BC-SNN) framework, which further improves representation ability by selecting appropriate BCs within and between blocks. Extensive experiments show that BC-SNN achieves state-of-the-art results on the CIFAR10, CIFAR100, and Tiny-ImageNet datasets, with accuracies of 95.67\%, 78.59\%, and 63.43\%, respectively. A set of ablation studies is further presented to understand the efficacy of each design component in BC-SNN.
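To make the idea of a backward connection concrete, the sketch below is a minimal, hypothetical PyTorch illustration (not the authors' released code): a two-layer fully connected SNN with leaky integrate-and-fire neurons, where the spikes of the back layer at time step t-1 are projected back into the front layer's input current at time step t, creating the extra temporal pathway the abstract describes. Layer sizes, time constants, and the absence of surrogate gradients are all simplifying assumptions for illustration only.

```python
# Illustrative sketch only -- NOT the paper's implementation. It assumes a simple
# two-layer fully connected SNN with LIF neurons and a single backward connection
# from the back (second) layer to the front (first) layer, delayed by one time step.
# Surrogate gradients for training are omitted for brevity.
import torch
import torch.nn as nn


class LIF(nn.Module):
    """Leaky integrate-and-fire neuron with a hard threshold and hard reset."""

    def __init__(self, tau: float = 2.0, v_th: float = 1.0):
        super().__init__()
        self.tau, self.v_th = tau, v_th

    def forward(self, current: torch.Tensor, v: torch.Tensor):
        v = v + (current - v) / self.tau      # leaky membrane potential update
        spike = (v >= self.v_th).float()      # emit a spike when threshold is reached
        v = v * (1.0 - spike)                 # hard reset after spiking
        return spike, v


class BackwardConnectedSNN(nn.Module):
    """Two-layer SNN with one backward (temporal feedback) connection."""

    def __init__(self, d_in: int, d_hidden: int, d_out: int):
        super().__init__()
        self.fc1 = nn.Linear(d_in, d_hidden)
        self.fc2 = nn.Linear(d_hidden, d_out)
        self.back = nn.Linear(d_out, d_hidden)   # backward connection: layer 2 -> layer 1
        self.lif1, self.lif2 = LIF(), LIF()

    def forward(self, x_seq: torch.Tensor):      # x_seq: [T, batch, d_in]
        T, batch = x_seq.shape[0], x_seq.shape[1]
        v1 = torch.zeros(batch, self.fc1.out_features)
        v2 = torch.zeros(batch, self.fc2.out_features)
        s2_prev = torch.zeros(batch, self.fc2.out_features)
        outputs = []
        for t in range(T):
            # The front layer receives its feedforward input plus feedback from the
            # back layer's spikes at the previous time step (the new temporal pathway).
            s1, v1 = self.lif1(self.fc1(x_seq[t]) + self.back(s2_prev), v1)
            s2, v2 = self.lif2(self.fc2(s1), v2)
            s2_prev = s2
            outputs.append(s2)
        return torch.stack(outputs)              # output spike trains, [T, batch, d_out]
```

Because the feedback is delayed by one time step, the backward connection adds no extra forward passes per step, which is consistent with the abstract's claim of minimal additional inference cost.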
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5207