Dyadic Learning in Asymmetric Hopfield Networks
Keywords: Hopfield Networks, Energy-Based Models, Energy-Based Learning, Dual Propagation, Equilibrium Propagation, Dyadic Learning, Contrastive Hebbian Learning
TL;DR: We apply dyadic learning, a generalization of dual propagation, to train asymmetric, skew-symmetric, symmetric, and feedforward convolutional Hopfield networks on CIFAR-10 classification and FashionMNIST denoising.
Abstract: Dual propagation is a local learning algorithm that treats neurons as simple two-compartment structures (dyads), encoding errors as their internal difference and predictions as their mean. Originally limited to feedforward (lower-triangular) models, a recent generalization, dyadic learning, extends it to networks with arbitrary connectivity. Here we show for the first time that such models can be effectively trained on CIFAR-10, and that they exhibit varied benefits and drawbacks depending on the structure of the weight matrix. In particular, symmetric, skew-symmetric, feedforward, and general asymmetric convolutional networks are assessed in both classification and denoising settings. We observe that the skew-symmetric and asymmetric models perform best on the denoising task and remain competitive on the classification task.
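To make the dyad picture concrete, the following is a minimal, hypothetical sketch of the encoding described in the abstract: each neuron keeps two internal states, with the prediction given by their mean and the error by their difference. All names and the update shown are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two compartments of one dyadic neuron (here a vector of 4 units).
s_plus = rng.normal(size=4)
s_minus = rng.normal(size=4)

prediction = 0.5 * (s_plus + s_minus)  # mean encodes the prediction
error = s_plus - s_minus               # internal difference encodes the error

# A local, Hebbian-style weight update could then combine the error at one
# layer with activity at another via an outer product (illustrative only).
pre_activity = rng.normal(size=3)
delta_W = np.outer(error, pre_activity)  # shape (4, 3)
```

Note that the two compartments can be recovered exactly from the prediction and error (s_plus = prediction + error/2, s_minus = prediction - error/2), so the dyad carries both signals without extra state.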
Submission Number: 44