STFormer: Spatial Temporal Spiking Transformer

18 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: spiking neural network, temporal-spatial information
TL;DR: Our work achieves state-of-the-art image classification using innovative spatial-temporal handling.
Abstract: Spiking Neural Networks (SNNs) are neural networks modelled on the biological processes of the brain. Instead of the continuous activations used by neurons in traditional Artificial Neural Networks (ANNs), neurons in SNNs communicate through brief, discrete spikes. Because of this brain-like, low-energy computation, SNNs have gained prominence in the neural network community. However, their intricate, non-linear dynamics make SNNs difficult to construct and train. To improve training and representation, modern SNNs are combined with the STBP (spatio-temporal backpropagation) training method and attention mechanisms. Prior studies, however, neglected the properties of spike data, leaving an unbridged divide between SNNs and ANNs. To better exploit the spatio-temporal characteristics of spike data, we introduce a Temporal Core to extract temporal features and a Spatial Core to enlarge the receptive field. We evaluate our model on both neuromorphic and non-neuromorphic datasets and achieve state-of-the-art results. Specifically, we attain an accuracy of 96.35\% on CIFAR10 and 81.39\% on CIFAR100. On the neuromorphic datasets, we reach 83.1\% on CIFAR10-DVS and 98.61\% on DVS-Gesture. The code will be made available at xxx.
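The abstract's contrast between spike-based and continuous activations rests on the standard leaky integrate-and-fire (LIF) neuron model used in most STBP-trained SNNs. The following is a minimal, stdlib-only sketch of those discrete-time dynamics, not the paper's actual implementation; the function names and the parameter values (`tau`, `v_threshold`, `v_reset`) are illustrative assumptions.

```python
def lif_step(v, x, tau=2.0, v_threshold=1.0, v_reset=0.0):
    """One discrete time step of a leaky integrate-and-fire neuron.

    v: membrane potential carried over from the previous step
    x: input current at this step
    Returns (spike, new_v), where spike is 0 or 1.
    """
    # Leaky integration: the membrane potential decays toward the input.
    v = v + (x - v) / tau
    # Emit a binary spike when the potential crosses the threshold.
    spike = 1 if v >= v_threshold else 0
    # Hard reset after a spike (one common convention; soft reset is another).
    if spike:
        v = v_reset
    return spike, v

def run(inputs):
    """Drive a single LIF neuron with an input sequence over T time steps."""
    v, spikes = 0.0, []
    for x in inputs:
        s, v = lif_step(v, x)
        spikes.append(s)
    return spikes
```

Because the spike function is a non-differentiable step, STBP-style training replaces its derivative with a smooth surrogate during backpropagation through both the spatial (layer) and temporal (time-step) dimensions; that surrogate is omitted here for brevity.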
Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 1027