Improving Performance in Neural Networks by Dendrites-Activated Connections

CoRR 2023 (modified: 24 Jan 2023)
Abstract: Computational units in artificial neural networks compute a linear combination of their inputs and then apply a nonlinear filter, often a ReLU shifted by some bias; if the inputs themselves come from other units, they have already been filtered with those units' own biases. Within a layer, multiple units share the same inputs, and each input is filtered with a single bias, so output values depend on shared input biases rather than on biases that are optimal for each individual connection. To mitigate this issue, we introduce DAC, a new computational unit based on preactivation and multiple biases, in which input signals undergo independent nonlinear filtering before the linear combination. We provide a Keras implementation and report its computational efficiency. We test DAC convolutions in ResNet architectures on CIFAR-10, CIFAR-100, Imagenette, and Imagewoof, achieving performance improvements of up to 1.73%. We exhibit examples where DAC is more efficient than its standard counterpart as a function approximator, and we prove a universal representation theorem.
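
The abstract mentions a Keras implementation, but the code is not included on this page. As a minimal illustrative sketch of the core idea only (all names here are hypothetical, and this is not the authors' implementation), a dense DAC-style layer would give each input-output connection its own bias and apply the nonlinearity before the linear combination, rather than applying one shared bias per input after it:

```python
import tensorflow as tf
from tensorflow import keras

class DACDense(keras.layers.Layer):
    """Hypothetical DAC-style dense layer (sketch, not the paper's code):
    each input is shifted by a per-connection bias and passed through a
    ReLU *before* the weighted sum, so every output unit filters each
    shared input with its own bias."""

    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units

    def build(self, input_shape):
        in_dim = int(input_shape[-1])
        # One weight and one bias per (input, output) connection.
        self.w = self.add_weight(shape=(in_dim, self.units),
                                 initializer="glorot_uniform", name="w")
        self.b = self.add_weight(shape=(in_dim, self.units),
                                 initializer="zeros", name="b")

    def call(self, x):
        # x: (batch, in_dim) -> (batch, in_dim, 1) so each connection
        # gets its own bias before the nonlinear filter.
        pre = tf.nn.relu(x[..., tf.newaxis] + self.b)  # (batch, in_dim, units)
        return tf.reduce_sum(pre * self.w, axis=1)     # (batch, units)
```

Usage is the same as any Keras layer, e.g. `y = DACDense(8)(tf.random.normal([4, 16]))` yields a tensor of shape `(4, 8)`. Note this sketch uses a dense layer for brevity; the paper's experiments use DAC convolutions inside ResNet architectures.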