Keywords: Hopfield neural network; Rectified Update Rule; Hamming distance; Convergence dynamics; Cyclic behavior; Interpretability
TL;DR: This paper proposes the Rectified and Hamming-Distance-Aware Rectified update rules for Hopfield neural networks, analyzes their dynamics when memorizing one or two messages, and validates the theoretical results through simulations.
Abstract: The Hopfield Neural Network (HNN) comprises \(N\) binary neurons, yielding a state space of size \(2^N\). Traditional update rules leave a neuron's next state unspecified when its input summation is zero, leading to symmetry-breaking artifacts and spurious cycles. To remedy this, we introduce the **Rectified Update Rule**, which retains each neuron's prior state in such tie scenarios, thereby restoring symmetry and ensuring stable convergence. Building upon this, we formulate the **Hamming-Distance-Aware Rectified (HDAR) Update Rule**, which incorporates the Hamming distance between two memorized messages. This rule preserves full symmetry between the two memories and their negations and yields a complete taxonomy of dynamic regimes: convergence, self-cycle, hetero-cycle, and symmetric cycle. Importantly, we encapsulate these dynamics in **two central theorems**—one characterizing single-message behavior and one for dual-message regimes—with full proofs in the Appendix. From these theorems, we derive corollaries that precisely quantify the counts and conditions of convergent versus cyclic states as functions of the network size and the Hamming distance. Extensive simulations, spanning exhaustive enumeration and Monte Carlo sampling, confirm all theoretical calculations.
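The tie-handling idea in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes standard Hebbian outer-product weights and a synchronous update, and shows only the core rectification: a neuron whose input summation is exactly zero keeps its previous state rather than being forced to a fixed value.

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian outer-product weights with zero diagonal (assumed storage rule)."""
    p = np.atleast_2d(patterns).astype(float)
    W = p.T @ p
    np.fill_diagonal(W, 0.0)
    return W

def rectified_update(W, s):
    """Synchronous rectified update: sign of the input summation,
    except that a zero summation leaves the neuron's state unchanged."""
    h = W @ s
    return np.where(h > 0, 1, np.where(h < 0, -1, s))

# Tiny demo: memorize one message and recover it from a one-bit corruption.
msg = np.array([1, -1, 1, 1, -1])
W = hebbian_weights(msg)
probe = msg.copy()
probe[0] = -probe[0]          # flip one bit
state = probe
for _ in range(5):            # iterate until (here, well past) convergence
    state = rectified_update(W, state)
print(state)                  # → [ 1 -1  1  1 -1]
```

With all-zero weights every input summation ties at zero, so the rectified rule maps each state to itself, which is the symmetry-restoring behavior the abstract describes.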
Primary Area: learning theory
Submission Number: 4469