Synaptic Diversity in ANNs Can Facilitate Faster Learning

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission · Readers: Everyone
Abstract: Various advancements in artificial neural networks (ANNs) are inspired by biological concepts, e.g., the artificial neuron, an efficient model of biological nerve cells that demonstrates learning capabilities on large amounts of data. More recent inspirations with promising results include advanced regularization techniques, e.g., synaptic scaling, and backpropagation alternatives, e.g., Targetprop. As neuroscience continues to discover and better understand the mechanisms of biological neural networks (BNNs), new opportunities arise to transfer these concepts to ANNs. However, only a few concepts are readily applicable, and improvements for ANNs are far from guaranteed. In this paper, we focus on the inhomogeneous and dynamically changing structures of BNNs, in contrast to the mostly homogeneous and fixed topologies of ANNs. More specifically, we transfer concepts of synaptic diversity, namely spontaneous synaptic remodeling, diversity in synaptic plasticity, and multi-synaptic connectivity, to ANNs. We observe that ANNs enhanced with these synaptic diversity concepts learn faster, predict with higher accuracy, and are more resilient to gradient inversion attacks. Our proposed methods are easily applicable to existing ANN topologies and are therefore intended to stimulate adoption of and further research into these mechanisms.
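The abstract does not spell out how these mechanisms are implemented; as an illustration only, the following PyTorch sketch shows one plausible way to realize all three in a single layer. The class and method names (`MultiSynapticLinear`, `n_synapses`, `remodel`, `fraction`) are hypothetical and not taken from the paper.

```python
import torch
import torch.nn as nn

class MultiSynapticLinear(nn.Module):
    """Hypothetical linear layer sketching three synaptic diversity ideas:
    - multi-synaptic connectivity: each input-output connection is realized
      by several parallel 'synapses' whose contributions are summed;
    - diversity in synaptic plasticity: fixed random per-synapse factors
      scale the gradients, emulating heterogeneous learning rates;
    - spontaneous synaptic remodeling: a random subset of synapses is
      re-initialized independently of the training signal."""

    def __init__(self, in_features, out_features, n_synapses=3):
        super().__init__()
        # One weight slice per parallel synapse group.
        self.weight = nn.Parameter(
            torch.randn(n_synapses, out_features, in_features) * 0.01
        )
        self.bias = nn.Parameter(torch.zeros(out_features))
        # Fixed, randomly drawn per-synapse plasticity factors in [0, 2).
        self.register_buffer("plasticity", torch.rand_like(self.weight) * 2.0)
        # Scale incoming gradients by the plasticity factors.
        self.weight.register_hook(lambda g: g * self.plasticity)

    def forward(self, x):
        # Sum contributions of all parallel synapses, then add the bias.
        return x @ self.weight.sum(dim=0).t() + self.bias

    @torch.no_grad()
    def remodel(self, fraction=0.01):
        # Spontaneous remodeling: re-initialize a random fraction of synapses.
        mask = torch.rand_like(self.weight) < fraction
        self.weight[mask] = torch.randn_like(self.weight[mask]) * 0.01
```

In a sketch like this, `remodel` would be called periodically during training (e.g., once per epoch), decoupling structural rewiring from the gradient signal while the plasticity factors diversify learning across synapses.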
One-sentence Summary: Bringing BNN-like layers to state-of-the-art network architectures.