A Lagrangian Perspective on Dual Propagation

Published: 01 Nov 2023 · Last Modified: 22 Dec 2023 · MLNCP Oral
Keywords: dual propagation, contrastive Hebbian learning, equilibrium propagation, biologically plausible learning, local learning
TL;DR: We present a Lagrangian-based derivation of dual propagation that makes the algorithm robust to asymmetric nudging.
Abstract: The search for "biologically plausible" learning algorithms has converged on the idea of representing gradients as activity differences. However, most approaches require a high degree of synchronization (distinct phases during learning) and introduce substantial computational overhead, which raises doubts about their biological plausibility as well as their potential usefulness for neuromorphic computing. Furthermore, they commonly rely on applying infinitesimal perturbations (nudges) to output units, which is impractical in noisy environments. Recently, it has been shown that by modelling artificial neurons as dyads with two oppositely nudged compartments, a fully local learning algorithm can bridge the performance gap to backpropagation without requiring separate learning phases, while also tolerating significant levels of nudging. However, this algorithm, called dual propagation, has the drawback that the convergence of its inference method relies on symmetric nudging of the output units, which may be infeasible in biological and analog implementations. Starting from a modified version of LeCun's Lagrangian approach to backpropagation, we derive a slightly altered variant of dual propagation that is robust to asymmetric nudging.
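To make the dyad mechanism concrete, here is a minimal NumPy sketch of a dual-propagation-style update for a single hidden layer. It is an illustrative reading of the abstract, not the paper's exact fixed-point equations: the variable names (`W1`, `W2`), the nudging strength `beta`, and the one-step inference are assumptions chosen for brevity, whereas the paper derives the full dynamics from a Lagrangian.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=4)              # input
y = rng.normal(size=2)              # target
W1 = 0.1 * rng.normal(size=(8, 4))  # input -> hidden weights (illustrative)
W2 = 0.1 * rng.normal(size=(2, 8))  # hidden -> output weights (illustrative)
beta, lr = 0.5, 0.01                # nudging strength, learning rate

f = np.tanh                         # activation

# Free forward pass: with no nudging, the two compartments of each
# dyad coincide, and the dyad transmits their mean.
h_mean = f(W1 @ x)
o_mean = W2 @ h_mean

# Oppositely nudged output compartments (symmetric nudging: +beta / -beta).
o_pos = o_mean + beta * (y - o_mean)
o_neg = o_mean - beta * (y - o_mean)
delta_o = o_pos - o_neg             # compartment difference encodes the error

# Hidden dyads receive the output difference through the transposed
# weights; the difference of their compartments plays the role of the
# gradient, without a separate backward phase.
h_pos = f(W1 @ x + 0.5 * W2.T @ delta_o)
h_neg = f(W1 @ x - 0.5 * W2.T @ delta_o)

# Fully local weight updates: postsynaptic activity difference times
# (mean) presynaptic activity.
W2 += lr * np.outer(delta_o, h_mean)
W1 += lr * np.outer(h_pos - h_neg, x)
```

Note that in this sketch the only error signal a layer sees is the activity difference of the layer above, and the output nudges are applied with equal strength in both directions; the paper's contribution is a variant whose inference remains well-behaved when the two nudges are asymmetric.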
Submission Number: 30