Structural Learning in Artificial Neural Networks: A Neural Operator Perspective
Abstract: Over the history of Artificial Neural Networks (ANNs), only a minority of algorithms have integrated structural changes to the network architecture into the learning process. Modern neuroscience, by contrast, has demonstrated that biological learning is largely structural: mechanisms such as synaptogenesis and neurogenesis are present in adult brains and are considered important for learning. Despite this history of artificial methods and biological inspiration, and despite the recent resurgence of neural methods in deep learning, relatively few current ANN methods incorporate structural changes during learning compared to those that adjust only synaptic weights during training. We aim to draw connections between approaches to structural learning that share similar abstractions, in order to encourage collaboration and development. In this review, we survey structural learning methods in deep ANNs and introduce a new neural operator framework, grounded in a cellular neuroscience context and perspective, aimed at motivating research on this challenging topic. We then provide an overview of ANN methods whose learning process includes structural changes expressible within this neural operator framework, characterizing each neural operator in detail and drawing connections to its biological counterpart. Finally, we present overarching trends in how these operators are implemented and discuss open challenges in structural learning in ANNs.
Certifications: Survey Certification
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Length: Long submission (more than 12 pages of main content)
Assigned Action Editor: ~Blake_Aaron_Richards1
Submission Number: 78