Keywords: Spiking Neural Networks, Self-Connection, Continuous Dynamical Systems, Approximation Power, Computational Efficiency
TL;DR: This work theoretically investigates the approximation power and computational efficiency of spiking neural networks with self-connections.
Abstract: Spiking neural networks have attracted increasing attention in recent years due to their potential for handling time-dependent data. Many algorithms and techniques have been developed; however, the theoretical understanding of many aspects of spiking neural networks remains far from clear. A recent work [Zhang and Zhou, 2021] revealed that typical spiking neural networks can hardly work on spatio-temporal data due to their bifurcation dynamics, and suggested that a self-connection structure has to be added. In this paper, we theoretically investigate the approximation ability and computational efficiency of spiking neural networks with self-connections, and show that the self-connection structure enables spiking neural networks to approximate discrete dynamical systems using a polynomial number of parameters within polynomial time complexity. Our theoretical results may shed light on future studies of spiking neural networks.
Supplementary Material: pdf
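To make the self-connection structure mentioned in the abstract concrete, the following is a minimal sketch of a discrete-time leaky integrate-and-fire (LIF) neuron whose own previous spike is fed back through a self-connection weight. The function name, parameters, and update rule are illustrative assumptions chosen for exposition, not the construction analyzed in the paper.

    import numpy as np

    def lif_self_connection(inputs, w_in, w_self, tau=0.9, v_th=1.0):
        """Simulate a discrete-time LIF neuron with a self-connection.

        inputs : (T, d) array of input signals over T time steps
        w_in   : (d,) input weights
        w_self : scalar self-connection weight (0.0 recovers a plain LIF neuron)
        tau    : membrane leak factor
        v_th   : firing threshold
        """
        T = inputs.shape[0]
        v = 0.0            # membrane potential
        s_prev = 0.0       # spike emitted at the previous step
        spikes = np.zeros(T)
        for t in range(T):
            # leaky integration of the external input plus the self-connection term
            v = tau * v + inputs[t] @ w_in + w_self * s_prev
            s = 1.0 if v >= v_th else 0.0   # threshold firing
            v = v * (1.0 - s)               # hard reset after a spike
            spikes[t] = s
            s_prev = s
        return spikes

    # Example: compare spike trains with and without the self-connection
    rng = np.random.default_rng(0)
    x = rng.normal(size=(50, 3))
    w = rng.normal(size=3)
    print(lif_self_connection(x, w, w_self=0.5)[:10])
    print(lif_self_connection(x, w, w_self=0.0)[:10])

In this sketch, setting w_self to zero removes the recurrent term, which is the structural difference the paper studies when comparing spiking networks with and without self-connections.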