Eligibility traces provide a data-inspired alternative to backpropagation through time

Published: 02 Oct 2019, Last Modified: 05 May 2023. Real Neurons & Hidden Units @ NeurIPS 2019 (Oral).
TL;DR: We present eligibility propagation, an alternative to BPTT that is compatible with experimental data on synaptic plasticity and competes with BPTT on machine learning benchmarks.
Keywords: neuroscience, plausible learning rules, spiking neurons, BPTT, recurrent neural networks, LSTM, RNN, computational neuroscience, backpropagation through time, online learning, real-time recurrent learning, RTRL, eligibility traces
Abstract: Learning in recurrent neural networks (RNNs) is most often implemented by gradient descent using backpropagation through time (BPTT), but BPTT does not accurately model how the brain learns. Instead, many experimental results on synaptic plasticity can be summarized as three-factor learning rules involving eligibility traces of the local neural activity and a third factor. We present here eligibility propagation (e-prop), a new factorization of the loss gradients in RNNs that fits the framework of three-factor learning rules when derived for biophysical spiking neuron models. When tested on the TIMIT speech recognition benchmark, it is competitive with BPTT for training both artificial LSTM networks and spiking RNNs. Further analysis suggests that the diversity of learning signals and the consideration of slow internal neural dynamics are decisive for the learning efficiency of e-prop.
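
Code sketch (not part of the submission): the factorization described in the abstract can be illustrated with a minimal numpy example of an e-prop-style update for a leaky tanh RNN. The network sizes, the leak constant, the toy data, and the use of symmetric feedback weights are assumptions made purely for illustration; the paper itself derives the rule for biophysical spiking neuron models and LSTMs.

```python
# Minimal sketch of an e-prop-style online gradient estimate for a leaky tanh RNN.
# Illustrative only: sizes, leak `alpha`, toy data, and symmetric feedback (B = W_out.T)
# are assumptions, not the authors' reference implementation.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_rec, n_out, T = 3, 20, 2, 50
alpha = 0.9          # leak of the hidden state (assumed)
lr = 1e-3            # learning rate (assumed)

W_in  = rng.normal(0, 1 / np.sqrt(n_in),  (n_rec, n_in))
W_rec = rng.normal(0, 1 / np.sqrt(n_rec), (n_rec, n_rec))
np.fill_diagonal(W_rec, 0.0)              # no self-connections, keeps the trace recursion simple
W_out = rng.normal(0, 1 / np.sqrt(n_rec), (n_out, n_rec))
B = W_out.T                               # symmetric feedback; a random B would give broadcast-style e-prop

x = rng.normal(size=(T, n_in))            # toy input sequence
y_target = rng.normal(size=(T, n_out))    # toy target sequence

h = np.zeros(n_rec)                       # hidden (pre-activation) state
z = np.zeros(n_rec)                       # activation ("firing rate")
eps = np.zeros((n_rec, n_rec))            # eligibility vectors eps[j, i] for W_rec[j, i]
grad_rec = np.zeros_like(W_rec)           # accumulated e-prop gradient estimate

for t in range(T):
    z_prev = z
    h = alpha * h + W_in @ x[t] + W_rec @ z_prev   # leaky hidden-state update
    z = np.tanh(h)
    y = W_out @ z
    err = y - y_target[t]

    # Eligibility trace: local term propagated forward in time, one per synapse (j, i).
    eps = alpha * eps + z_prev[None, :]            # eps_{ji}^t = alpha * eps_{ji}^{t-1} + z_i^{t-1}
    psi = 1.0 - z ** 2                             # derivative of tanh (pseudo-derivative for spiking units)
    e_trace = psi[:, None] * eps                   # e_{ji}^t = psi_j^t * eps_{ji}^t

    # Third factor: per-neuron learning signal from the instantaneous output error.
    L = B @ err                                    # L_j^t ~ dE^t/dz_j^t, truncated to the local path

    grad_rec += L[:, None] * e_trace               # dE/dW_{ji} ~ sum_t L_j^t * e_{ji}^t

W_rec -= lr * grad_rec                             # one gradient step from the accumulated estimate
```

Because the eligibility trace is carried forward in time alongside the network state, the update needs no backward pass through the sequence, which is what makes such a rule online and local to each synapse.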