Layer-skipping connections facilitate training of layered networks using equilibrium propagation

Published: 01 Jan 2020 · Last Modified: 25 Jan 2025 · ICONS 2020 · CC BY-SA 4.0
Abstract: Equilibrium propagation is a learning framework that marks a step forward in the search for a biologically plausible implementation of deep learning, and is appealing for implementation in neuromorphic analog hardware. However, previous implementations on layered networks encountered a vanishing gradient problem that has not yet been solved in a simple, biologically plausible way. In this paper, we demonstrate that the vanishing gradient problem can be overcome by replacing some of a layered network’s connections with random layer-skipping connections. This approach is biologically plausible and could be conveniently implemented in neuromorphic analog hardware.
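To make the idea concrete, below is a minimal NumPy sketch of equilibrium propagation (two relaxation phases, free and nudged, followed by a contrastive weight update) on a small energy-based network whose connectivity mask mixes standard adjacent-layer links with random layer-skipping links, in the spirit of the abstract. All sizes, learning rates, and the skip probability are illustrative assumptions, not values from the paper.

```python
# Sketch: equilibrium propagation on a layered Hopfield-like network with
# random layer-skipping connections. Hyperparameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
sizes = [4, 8, 8, 2]                              # input, two hidden, output (assumed)
n = sum(sizes)
layer = np.repeat(np.arange(len(sizes)), sizes)   # layer index of each unit

# Symmetric connectivity mask: all adjacent-layer pairs, plus random
# layer-skipping pairs (here with probability 0.2, an assumed value).
mask = np.zeros((n, n), dtype=bool)
for i in range(n):
    for j in range(i + 1, n):
        gap = layer[j] - layer[i]
        if gap == 1 or (gap > 1 and rng.random() < 0.2):
            mask[i, j] = mask[j, i] = True
W = rng.normal(0.0, 0.1, (n, n)) * mask
W = (W + W.T) / 2                                 # energy models need symmetric weights

rho = lambda s: np.clip(s, 0.0, 1.0)              # hard-sigmoid activation
rho_p = lambda s: ((s > 0) & (s < 1)).astype(float)

def relax(s, x, target=None, beta=0.0, steps=100, dt=0.2):
    """Gradient descent on the energy, with input units clamped to x."""
    idx_in = layer == 0
    idx_out = layer == len(sizes) - 1
    for _ in range(steps):
        ds = -s + rho_p(s) * (W @ rho(s))         # -dE/ds of the Hopfield energy
        if target is not None:
            ds[idx_out] += beta * (target - s[idx_out])  # nudge outputs to target
        s = s + dt * ds
        s[idx_in] = x                             # keep inputs clamped
    return s

def eqprop_step(W, x, target, beta=0.5, lr=0.05):
    s0 = relax(np.zeros(n), x)                    # free phase (beta = 0)
    s1 = relax(s0.copy(), x, target, beta)        # weakly nudged phase
    # Contrastive update: dW_ij ~ (rho_i rho_j at nudged minus free) / beta
    dW = (np.outer(rho(s1), rho(s1)) - np.outer(rho(s0), rho(s0))) / beta
    return W + lr * dW * mask, s0

# One toy update: map a random input toward a fixed two-unit target.
x, target = rng.random(4), np.array([1.0, 0.0])
W, s_free = eqprop_step(W, x, target)
print("free-phase output:", s_free[layer == len(sizes) - 1])
```

The key detail for the paper's claim is the mask construction: skip connections give error signals a short path from the output layer back to early layers during the nudged phase, which is the mechanism the authors credit with mitigating the vanishing gradient.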