Reproducibility Study of Spike-Train Level Backpropagation for Training Deep Recurrent Spiking Neural Networks

29 Dec 2019 (modified: 05 May 2023) · NeurIPS 2019 Reproducibility Challenge Blind Report
Abstract: In this paper, we examine the findings presented in the novel spike-train level backpropagation algorithm ST-RSBP, trained on different types of spiking neural networks (SNNs). The ST-RSBP method improves upon existing spike-train level backpropagation algorithms by increasing the accuracy of the differentiation of activation layers and by adding backpropagation through recurrent neural connections. We analyze the ST-RSBP differentiation technique through an ablation study of the algorithm, in which we alter or remove parts of the released code to examine their impact on the reported results. Through our analysis we conclude that the paper does improve on the performance of the authors' previous algorithm.
Track: Ablation
NeurIPS Paper Id: https://openreview.net/forum?id=Bkg6nVHlIH&noteId=Bkle07MfsH