Temporal Flexibility in Spiking Neural Networks: A Novel Training Method for Enhanced Generalization Across Time Steps

22 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: optimization
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Spiking Neural Networks, Direct Training
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Spiking Neural Networks (SNNs), models inspired by neural mechanisms in the brain, enable energy-efficient implementations on neuromorphic hardware. However, current direct training approaches optimize parameters only for an SNN operating at a specific time step, so generalizing to other time steps requires fine-tuning, which is computationally inefficient. In this study, we first examine the feasibility of parameter sharing across structurally identical SNNs operating at different time steps. We then propose a novel training methodology, mixed time step training (MTT), which produces a temporally flexible SNN (TFSNN). During training, different time steps are randomly assigned to distinct SNN blocks, together with newly established inter-block communication protocols. After training, the TFSNN can be reduced to an SNN operating at any chosen fixed time step without fine-tuning. Experimental results across all primary datasets demonstrate that the TFSNN generalizes more robustly than existing training methodologies that rely on a fixed time step. Notably, we achieve 96.84% accuracy on CIFAR10, 81.98% on CIFAR100, and 68.34% on ImageNet with T = 6.
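
To make the training scheme described in the abstract concrete, below is a minimal PyTorch-style sketch of how mixed time step training might be organized: each block is unrolled for its own randomly drawn time step, and a simple rate-code resampling serves as the inter-block communication protocol. The names (`MTTNet`, `align_steps`), the alignment strategy, and the per-block sampling range are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

def align_steps(seq, t_out):
    # seq: (t_in, batch, ...). If the lengths already match, pass through;
    # otherwise collapse to a rate code and repeat it for t_out steps.
    # This is one plausible inter-block protocol, assumed for illustration.
    if seq.shape[0] == t_out:
        return seq
    rate = seq.mean(dim=0, keepdim=True)
    return rate.expand(t_out, *seq.shape[1:]).contiguous()

class MTTNet(nn.Module):
    def __init__(self, blocks, t_max=6):
        super().__init__()
        # Each block maps a spike sequence (t, batch, ...) to a sequence
        # of the same length, e.g. a stack of spiking conv layers.
        self.blocks = nn.ModuleList(blocks)
        self.t_max = t_max

    def forward(self, x, fixed_t=None):
        # x: static input of shape (batch, ...); replicate it over time.
        t = fixed_t or int(torch.randint(1, self.t_max + 1, (1,)))
        seq = x.unsqueeze(0).expand(t, *x.shape).contiguous()
        for block in self.blocks:
            # During training each block draws its own random time step;
            # at inference, fixed_t pins every block to one value.
            t = fixed_t or int(torch.randint(1, self.t_max + 1, (1,)))
            seq = block(align_steps(seq, t))
        return seq.mean(dim=0)  # average over time for rate-coded logits
```

At inference, calling `forward(x, fixed_t=T)` runs every block for the same T steps, mirroring the abstract's claim that the trained TFSNN reduces to a fixed-time-step SNN with no fine-tuning.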
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5117