Keywords: Bio-inspired learning, Contrastive Hebbian learning, Hopfield models
TL;DR: We present a framework for training feedforward networks, symmetric Hopfield models, and skew-symmetric Hopfield models. The skew-symmetric Hopfield setting has some neat properties: robustness to perturbations and reduced neural saturation.
Abstract: From electrical to biological circuits, feedback plays a critical role in amplifying, dampening, and stabilizing signals. In local, activity-difference-based alternatives to backpropagation, feedback connections are used to propagate learning signals through deep neural networks. We propose a saddle-point-based framework using dyadic (two-state) neurons for training a family of parameterized models, which includes the symmetric Hopfield model, pure feedforward networks, and a less explored skew-symmetric Hopfield variant. The resulting learning method reduces to equilibrium propagation (EP) for symmetric Hopfield models and to dual propagation (DP) for feedforward networks, while the skew-symmetric Hopfield setting yields a new method with desirable robustness properties. Experimentally, we demonstrate that the new skew-symmetric Hopfield model matches the predictive performance of EP and DP while exhibiting enhanced robustness to input perturbations and strong feedback, and being less prone to neural saturation. We identify the fundamentally different types of feedback signals propagated in each model as the main cause of the differences in robustness and saturation.
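As a sketch of the model family referred to above (this particular parameterization is an assumption made for illustration, not quoted from the paper): writing $W_{\text{ff}}$ for the feedforward weights and $W_{\text{fb}}$ for the feedback weights, the three named models correspond to three choices of a single sign parameter $\gamma$ in

$$
W_{\text{fb}} = \gamma\, W_{\text{ff}}^{\top},
\qquad
\gamma =
\begin{cases}
+1 & \text{symmetric Hopfield model (learning reduces to EP)} \\
\phantom{+}0 & \text{pure feedforward network (learning reduces to DP)} \\
-1 & \text{skew-symmetric Hopfield variant (new method)}
\end{cases}
$$

so that the full coupling matrix built from $W_{\text{ff}}$ below the diagonal and $W_{\text{fb}}$ above it is symmetric for $\gamma = +1$, strictly feedforward for $\gamma = 0$, and skew-symmetric for $\gamma = -1$.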
Submission Number: 105