On the Generalization Bounds of Spiking Neural Networks via Rademacher Complexity

17 Sept 2025 (modified: 25 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Spiking Neural Networks, Generalization, Rademacher Complexity, Covering Number
TL;DR: We theoretically establish generalization bounds for SNNs with general spiking neurons via Rademacher complexity.
Abstract: Spiking Neural Networks (SNNs) have garnered increasing attention as bio-inspired models due to their great potential in neuromorphic computing and sparse computation. Many algorithms and techniques have been developed; however, the theoretical understanding of generalization, that is, the extent to which SNNs perform well on unseen data, remains far from clear. Recently, Zhang et al. (2024) showed that the generalization of SNNs with stochastic firing mechanisms can be upper bounded by an exponential function of the excitation probability. In this paper, we theoretically investigate the generalization of SNNs with commonly used integrate-and-fire schemes. We establish generalization bounds for several leaky integrate-and-fire (LIF) formulations via the empirical Rademacher complexity and the covering number. Our theoretical results may shed light on future studies of SNNs.
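For readers unfamiliar with the quantities named in the abstract, the following is a minimal sketch of the standard definitions, not the paper's exact formulations: the empirical Rademacher complexity of a hypothesis class $\mathcal{F}$ on a sample $S = \{x_1, \dots, x_n\}$, and a common discrete-time LIF update with hard reset. The leak factor $\lambda$, weights $W$, and threshold $V_{\mathrm{th}}$ are assumed notation.

\[
  \widehat{\mathfrak{R}}_S(\mathcal{F})
  = \mathbb{E}_{\sigma}\!\left[\sup_{f \in \mathcal{F}}
    \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(x_i)\right],
\]
where the $\sigma_i$ are i.i.d. Rademacher variables, uniform on $\{-1, +1\}$. A common discrete-time LIF neuron, with membrane potential $u[t]$, spike $s[t]$, leak $\lambda \in (0,1)$, input $x[t]$, and Heaviside step $\Theta$, evolves as

\[
  u[t] = \lambda \, u[t-1]\bigl(1 - s[t-1]\bigr) + W x[t],
  \qquad
  s[t] = \Theta\!\bigl(u[t] - V_{\mathrm{th}}\bigr).
\]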
Supplementary Material: pdf
Primary Area: learning theory
Submission Number: 8768