Dual Propagation: Accelerating Contrastive Hebbian Learning with Dyadic Neurons

Published: 16 Jun 2023, Last Modified: 17 Jul 2023, ICML LLW 2023
Keywords: Contrastive Hebbian learning, Energy-based models, Lifted neural networks, Biological plausibility, Neuromorphic computing
TL;DR: We present a contrastive Hebbian learning algorithm, built on compartmental neurons, which matches backprop in terms of runtime and performance.
Abstract: Activity-difference-based learning algorithms---such as contrastive Hebbian learning and equilibrium propagation---have been proposed as biologically plausible alternatives to error back-propagation. However, on traditional digital chips these algorithms suffer from having to solve a costly inference problem twice, making these approaches more than two orders of magnitude slower than back-propagation. In the analog realm, equilibrium propagation may be promising for fast and energy-efficient learning, but states still need to be inferred and stored twice. Inspired by lifted neural networks and compartmental neuron models, we propose a simple energy-based compartmental neuron model, termed dual propagation, in which each neuron is a dyad with two intrinsic states. At inference time these intrinsic states encode the error/activity duality through their difference and their mean, respectively. The advantage of this method is that only a single inference phase is needed and that inference can be solved in layerwise closed-form. Experimentally we show on common computer vision datasets, including Imagenet32x32, that dual propagation performs equivalently to back-propagation in terms of both accuracy and runtime.
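To make the abstract's description concrete, below is a minimal, illustrative sketch of a dyadic-neuron update in NumPy: each neuron carries two intrinsic states whose mean acts as the activity and whose difference carries the error, inference is a single layerwise sweep, and the weight update is a contrastive Hebbian-style product of state differences and presynaptic means. The specific update equations, the nudging strength `beta`, and the network sizes are assumptions for illustration only, not the paper's exact derivation.

```python
# Illustrative sketch of dual-propagation-style dyadic neurons (assumed update
# rules; not the paper's exact equations).
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def relu_grad(x):
    return (x > 0.0).astype(x.dtype)

# Toy 2-layer network: input -> hidden -> output (dimensions are arbitrary).
n_in, n_hid, n_out = 8, 16, 4
W1 = rng.normal(scale=0.3, size=(n_hid, n_in))
W2 = rng.normal(scale=0.3, size=(n_out, n_hid))

x = rng.normal(size=(n_in,))     # input sample
t = rng.normal(size=(n_out,))    # regression target
lr, beta = 0.05, 1.0             # learning rate and nudging strength (assumed)

# --- Single inference phase, solved layerwise (one forward + one feedback sweep) ---
# Forward sweep: mean states (the "activity" half of each dyad).
a1 = W1 @ x
h_mean = relu(a1)
y_mean = W2 @ h_mean             # linear output layer

# Feedback sweep: set the two intrinsic states of each dyad so that their
# difference encodes the (nudged) error while their mean stays at the activity.
err_out = beta * (t - y_mean)    # top-down teaching signal at the output
y_pos, y_neg = y_mean + 0.5 * err_out, y_mean - 0.5 * err_out

err_hid = (W2.T @ (y_pos - y_neg)) * relu_grad(a1)
h_pos, h_neg = h_mean + 0.5 * err_hid, h_mean - 0.5 * err_hid

# --- Contrastive Hebbian-style weight update from the state differences ---
# dW ~ (post_pos - post_neg) outer (pre mean); only one inference phase was run.
W2 += lr * np.outer(y_pos - y_neg, h_mean)
W1 += lr * np.outer(h_pos - h_neg, x)

print("output error before update:", np.round(t - y_mean, 3))
```

In this sketch the error never needs a separate clamped inference phase: the dyad's two states are computed in closed form per layer from the forward activity and the top-down difference signal, which is the property the abstract credits for matching back-propagation's runtime.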
Submission Number: 15