Self-Organizing Resonant Network

ICLR 2026 Conference Submission 25477 Authors

20 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Continual Learning, Self-Organizing Networks, Hebbian Learning, Structural Plasticity, Online Learning, Representation Learning
TL;DR: A backpropagation-free learning paradigm in which a network self-organizes by dynamically creating neurons for novel concepts and learning their associations via local rules.
Abstract: We introduce the Self-Organizing Resonant Network (SORN), a novel learning paradigm that operates without backpropagation. To address the core challenges of representation quality, learning stability, and adaptability that existing continual learning models face, SORN operates within a robust feature space encoded online. Its learning process is driven by two tightly coupled, biologically inspired plasticity principles: (1) Novelty-Gated Structural Plasticity: the system dynamically creates a new neural prototype only when an input cannot be adequately represented by existing knowledge (resonators), a mechanism analogous to a self-growing vector-quantized codebook. (2) Stable Hebbian Synaptic Plasticity: by incorporating Hebbian variants with normalization and homeostatic mechanisms, the network's association matrix stably learns sparse inter-concept correlations, avoiding weight explosion and saturation. We theoretically establish the framework's computational efficiency and convergence. Extensive experiments on standard continual learning benchmarks and unbounded data streams show that SORN not only surpasses mainstream methods in accuracy and resistance to catastrophic forgetting, but also exhibits superior autonomous concept formation and stable adaptation in continuous, non-stationary environments.
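The abstract's two plasticity rules map naturally onto a small online loop, sketched below as one illustrative reading rather than the authors' implementation. Everything concrete here is an assumption standing in for details the abstract leaves unspecified: the class name SORNSketch, the cosine-similarity resonance test, the novelty threshold of 0.8, and the decay-plus-clipping stabilizer used in place of the paper's exact normalization and homeostatic mechanisms.

    import numpy as np

    class SORNSketch:
        """Illustrative sketch of SORN's two plasticity rules (assumed forms)."""

        def __init__(self, dim, novelty_threshold=0.8, lr=0.1, decay=0.01):
            self.dim = dim
            self.novelty_threshold = novelty_threshold  # resonance cutoff (assumed value)
            self.lr = lr                                # local learning rate
            self.decay = decay                          # homeostatic decay (assumed mechanism)
            self.prototypes = []                        # self-growing codebook of resonators
            self.assoc = np.zeros((0, 0))               # inter-concept association matrix

        def _resonance(self, x):
            # Cosine similarity between the input and every stored prototype.
            if not self.prototypes:
                return np.zeros(0)
            P = np.stack(self.prototypes)
            return P @ x / (np.linalg.norm(P, axis=1) * np.linalg.norm(x) + 1e-9)

        def step(self, x):
            assert x.shape == (self.dim,)
            x = x / (np.linalg.norm(x) + 1e-9)
            sims = self._resonance(x)

            # (1) Novelty-gated structural plasticity: create a new resonator
            # only when no existing one represents the input well enough.
            if sims.size == 0 or sims.max() < self.novelty_threshold:
                self.prototypes.append(x.copy())
                n = len(self.prototypes)
                grown = np.zeros((n, n))                # grow the association matrix
                grown[: n - 1, : n - 1] = self.assoc
                self.assoc = grown
                winner = n - 1
            else:
                winner = int(sims.argmax())
                # Online VQ-style refinement of the winning prototype.
                self.prototypes[winner] += self.lr * (x - self.prototypes[winner])

            # (2) Stable Hebbian synaptic plasticity: co-activation strengthens
            # associations; multiplicative decay plus clipping stands in for the
            # paper's normalization/homeostasis and keeps weights bounded.
            act = np.zeros(len(self.prototypes))
            act[: sims.size] = np.clip(sims, 0.0, 1.0)
            act[winner] = 1.0
            self.assoc = (1.0 - self.decay) * self.assoc + self.lr * np.outer(act, act)
            np.fill_diagonal(self.assoc, 0.0)           # no self-associations
            self.assoc = np.clip(self.assoc, 0.0, 1.0)
            return winner

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        net = SORNSketch(dim=16)
        for _ in range(200):                            # unlabeled, non-stationary stream
            net.step(rng.normal(size=16))
        print(len(net.prototypes), net.assoc.shape)     # codebook grows with novelty

Under this reading, each step uses only local updates and costs O(Nd) for N resonators of dimension d, which is one way to make sense of the abstract's computational-efficiency claim.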
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 25477