DoB-SNN: A New Neuron Assembly-Inspired Spiking Neural Network for Pattern Classification

Published: 2021 (IJCNN 2021), Last Modified: 12 May 2023
Abstract: Spiking neural networks (SNNs), the third generation of artificial neural networks, are closer to their biological counterparts than their predecessors. SNNs have a higher computational capacity and lower power requirements than networks of sigmoidal neurons. In this paper, a new spiking neural network for pattern classification, referred to as Degree of Belonging SNN (DoB-SNN), is introduced. DoB-SNN is inspired by a neuronal assembly in which each neuron has a degree of belonging (DoB) to every class of data being processed. DoB-SNN clusters the neurons during training using DoBs to allocate a group of neurons to each class. A new training algorithm is presented that adjusts the DoBs along with the network's synaptic weights, based on Spike-Timing Dependent Plasticity (STDP) and neuron activity for the training samples. The performance of DoB-SNN is evaluated on five datasets from the UCI machine learning repository. Nested cross-validation is employed to determine the network's hyperparameters for each dataset and to thoroughly assess generalisation capability. A detailed comparison on these datasets with three other supervised learning algorithms, SpikeProp, SWAT, and SRESN, is provided. The results show that no algorithm significantly outperforms DoB-SNN, whereas DoB-SNN performs significantly better than the others on the Liver Disorders dataset (>6.10%, $p < 0.01$). Accuracies obtained by DoB-SNN are significantly greater than SWAT for both Iris and Breast Cancer (>1.69%, $p < 0.001$) and significantly better than SpikeProp for Iris (1.62%, $p = 0.04$). In all comparisons, DoB-SNN used the smallest network among the compared algorithms.
DoB-SNN therefore offers significant potential as an alternative SNN architecture and learning algorithm.
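The abstract describes two interacting updates: a pair-based STDP rule for the synaptic weights and an adjustment of each neuron's degree-of-belonging vector based on its activity for a training sample. The sketch below is a minimal illustration of what such updates might look like; the function names, parameter values, and the DoB normalisation step are assumptions for illustration, not the paper's exact algorithm.

```python
import math

def stdp_delta(t_pre, t_post, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Weight change for one pre/post spike pair (times in ms).

    Standard pair-based STDP: potentiate when the presynaptic spike
    precedes the postsynaptic one, depress otherwise. Constants are
    illustrative, not taken from the paper.
    """
    dt = t_post - t_pre
    if dt >= 0:  # pre before post -> potentiation
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)  # post before pre -> depression

def update_dob(dob, label, lr=0.05):
    """Hypothetical DoB update: nudge a neuron's degree-of-belonging
    vector toward the class of the current training sample, then
    renormalise so the DoBs remain a distribution over classes.
    """
    dob = [d + (lr if c == label else 0.0) for c, d in enumerate(dob)]
    total = sum(dob)
    return [d / total for d in dob]
```

Under this reading, neurons whose DoB becomes concentrated on one class end up allocated to that class's group, which is how the clustering during training described in the abstract could emerge.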