A Study of Biologically Plausible Neural Network: The Role and Interactions of Brain-Inspired Mechanisms in Continual Learning

Published: 09 Apr 2023, Last Modified: 17 Sept 2024
Accepted by TMLR (CC BY 4.0)
Event Certifications: lifelong-ml.cc/CoLLAs/2023/Journal_Track
Abstract: Humans excel at continually acquiring, consolidating, and retaining information from an ever-changing environment, whereas artificial neural networks (ANNs) exhibit catastrophic forgetting. There are considerable differences in the complexity of synapses, the processing of information, and the learning mechanisms between biological neural networks and their artificial counterparts, which may explain the mismatch in performance. We consider a biologically plausible framework comprising separate populations of exclusively excitatory and inhibitory neurons that adhere to Dale's principle, in which the excitatory pyramidal neurons are augmented with dendrite-like structures for context-dependent processing of stimuli. We then conduct a comprehensive study of the role and interactions of different brain-inspired mechanisms, including sparse non-overlapping representations, Hebbian learning, synaptic consolidation, and replay of past activations that accompanied the learning event. Our study suggests that employing multiple complementary mechanisms in a biologically plausible architecture, as the brain does, may be effective in enabling continual learning in ANNs. \footnote{We will make the code available upon acceptance.}
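To make the Dale's-principle constraint in the abstract concrete, here is a minimal sketch of a linear layer whose presynaptic neurons are each fixed as excitatory (all outgoing weights non-negative) or inhibitory (all outgoing weights non-positive). This is an illustrative assumption, not the authors' implementation; the class name `DaleLinear` and all parameters are hypothetical.

```python
import numpy as np

# Hypothetical sketch (not the paper's code): a linear layer obeying
# Dale's principle -- each presynaptic neuron is either excitatory
# (all outgoing weights >= 0) or inhibitory (all outgoing weights <= 0).
class DaleLinear:
    def __init__(self, n_in, n_out, frac_inhibitory=0.2, seed=0):
        rng = np.random.default_rng(seed)
        # Store non-negative weight magnitudes; the sign is fixed per input neuron.
        self.W = rng.uniform(0.0, 0.1, size=(n_out, n_in))
        n_inh = int(frac_inhibitory * n_in)
        signs = np.ones(n_in)
        signs[:n_inh] = -1.0  # the first n_inh inputs are inhibitory
        self.signs = signs

    def forward(self, x):
        # Effective weights = magnitude * fixed sign, so a neuron's
        # outgoing synapses can never change sign during learning.
        return (self.W * self.signs) @ x

layer = DaleLinear(n_in=10, n_out=4)
out = layer.forward(np.ones(10))
```

Keeping magnitudes and signs separate means any gradient update applied to `W` (followed by clipping to non-negative values) preserves each neuron's excitatory or inhibitory identity.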
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: Incorporated the changes to the manuscript suggested by the area chair:
- Defined Heterogeneous Dropout in the contributions.
- Added a reference to Table 6 in the Table 1 caption.
- Added descriptions of the acronyms in Table 1.
- Added the definitions of Forward Transfer and Forgetting in Section A.4.
- Toned down and qualified the claims in Sections 4.1 and 4.3 and in the Discussion (in the discussion of the quality of the context signal for Permuted-MNIST and Rot-MNIST, added a reference to the performance degradation on Rot-MNIST relative to a standard ANN).
Video: https://www.youtube.com/watch?v=xh2iyEwLnSg&ab_channel=NeurAI
Assigned Action Editor: ~Josh_Merel1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 810