Keywords: Contrastive Hebbian learning, anti-symmetric weights, Hopfield model, Bio-inspired learning
TL;DR: We propose a framework built around two-state neurons that permits training feedforward networks, symmetric Hopfield models, and skew-symmetric Hopfield models. The skew-symmetric Hopfield setting leads to interesting robustness properties.
Abstract: From electrical to biological circuits, feedback plays a critical role in amplifying, dampening and stabilizing signals. In local, activity-difference-based alternatives to backpropagation, feedback connections are used to propagate learning signals through deep neural networks. We propose a saddle-point-based framework using dyadic (two-state) neurons for training a family of parameterized models, which includes the symmetric Hopfield model, pure feedforward networks and a less explored skew-symmetric Hopfield variant. The resulting learning method reduces to equilibrium propagation (EP) for symmetric Hopfield models and to dual propagation (DP) for feedforward networks, while the skew-symmetric Hopfield setting yields a new method with desirable robustness properties. Experimentally, we demonstrate that the new skew-symmetric Hopfield model matches the predictive performance of EP and DP, while exhibiting enhanced robustness to input changes and strong feedback, and being less prone to neural saturation. We identify the fundamentally different types of feedback signals propagated in each model as the main cause of the differences in robustness and saturation.
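As a rough illustration of the symmetric versus skew-symmetric contrast, below is a minimal NumPy sketch of generic rate-based Hopfield relaxation under each type of coupling. The dynamics `ds/dt = -s + tanh(W s + x)`, the step sizes, and all names are illustrative placeholders, not the paper's saddle-point formulation or its dyadic-neuron updates.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
A = rng.standard_normal((n, n)) / np.sqrt(n)

# Symmetric coupling (classic Hopfield): W = W^T
W_sym = (A + A.T) / 2
# Skew-symmetric coupling: W = -W^T, so s^T W s = 0 for every state s
W_skew = (A - A.T) / 2

def relax(W, x, steps=200, dt=0.1):
    """Placeholder rate-based relaxation: ds/dt = -s + tanh(W s + x)."""
    s = np.zeros(n)
    for _ in range(steps):
        s = s + dt * (-s + np.tanh(W @ s + x))
    return s

x = rng.standard_normal(n)
print("symmetric fixed point:     ", relax(W_sym, x).round(3))
print("skew-symmetric fixed point:", relax(W_skew, x).round(3))
```

In the skew-symmetric case the recurrent term contributes no quadratic energy (`s^T W s = 0`), which gives a loose intuition for why feedback signals propagate differently in that setting; the paper's precise mechanism is given by its dyadic-neuron saddle-point dynamics, not this sketch.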
Submission Number: 52