Biologically-inspired adaptive learning in the Hopfield-network based self-optimization model

Published: 27 Oct 2023, Last Modified: 26 Nov 2023 (AMHN23 Poster)
Keywords: Hopfield network, adaptive learning, metaplasticity, homeostasis, associative memory
TL;DR: Computationally implementing mechanisms that living systems use to learn and adapt to their environments could improve the performance of associative memory networks, and potentially resolve some of the dilemmas plaguing deep learning models.
Abstract: A significant portion of the recent growth of artificial intelligence can be attributed to the development of deep learning systems, hand in hand with the accumulation of Big Data. It is therefore unsurprising that these systems are most often based on supervised or reinforcement learning over massive datasets, with reward- or error-based rules for training. Though these techniques have achieved impressive levels of accuracy and functionality, rivaling human cognition in some areas, they appear to work very differently from living systems, which can learn, make associations, and adapt with very sparse data, efficient use of energy, and comparatively few training iterations. Within machine learning, Hopfield networks, whose architecture allows for unsupervised learning, associative memory, scaling, and modularity, offer an alternative approach to artificial intelligence that has the potential to hew closer to biological forms of learning. This work distills some mechanisms of adaptation in biological systems, including metaplasticity, homeostasis, and inhibition, and proposes ways in which these features can be incorporated into Hopfield networks through adjustments to the learning rate, modularity, and activation rule. The overall aim is to develop deep learning tools that recapitulate the advantages of biological systems, and to obtain a computational method that can plausibly model a wide range of living and adaptive systems of varying levels of complexity.
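The self-optimization model the abstract builds on repeatedly relaxes a Hopfield network to an attractor and then applies Hebbian learning to that attractor state, gradually reshaping the energy landscape. The following is a minimal sketch of that loop; the network size, learning rate, and iteration counts are illustrative assumptions, not values from the paper, and a metaplastic variant of the kind the abstract proposes would modulate the learning rate over time rather than keep it fixed.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50  # number of units (hypothetical size for illustration)

# Random symmetric constraint weights with zero self-coupling.
W = rng.standard_normal((N, N))
W = (W + W.T) / 2.0
np.fill_diagonal(W, 0.0)

def relax(weights, state, steps=500):
    """Asynchronous +/-1 updates; the Hopfield energy is non-increasing
    under this rule for symmetric weights with zero diagonal."""
    s = state.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1 if weights[i] @ s >= 0 else -1
    return s

def energy(weights, s):
    """Standard Hopfield energy E = -1/2 s^T W s."""
    return -0.5 * s @ weights @ s

# Self-optimization loop: relax from random initial states, then apply a
# small Hebbian update on each attractor found. `alpha` is a hypothetical
# learning rate; an adaptive (metaplastic) scheme would adjust it online.
W_learn = W.copy()
alpha = 0.01 / N
for _ in range(100):
    s = rng.choice(np.array([-1, 1]), size=N)
    s = relax(W_learn, s)
    W_learn += alpha * np.outer(s, s)  # Hebbian reinforcement of the attractor
    np.fill_diagonal(W_learn, 0.0)     # keep self-coupling at zero
```

In this sketch the learned weights remain symmetric, so relaxation still descends an energy function; the Hebbian updates deepen the basins of attractors the dynamics actually visit, which is the sense in which the network "self-optimizes".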
Submission Number: 46