Doing More with Less: Computational Role of Information Structure in Neural Networks based on Entropy Maximization

Published: 10 Oct 2024, Last Modified: 20 Nov 2024, NeuroAI @ NeurIPS 2024 Poster, CC BY 4.0
Keywords: neuro-inspired AI, neuro-inspired computations, entropy maximization, neuro-inspired reasoning and decision-making, cognitive functions in AI
TL;DR: We use a method that changes the information structure in neurons to maximize information capacity and detect hierarchical patterns.
Abstract: We propose a bio-inspired concept based on entropy maximization in neural networks for memory storage and higher-order cognitive skills. We emphasize the role of information structure in splitting high-resolution inputs into smaller pieces distributed across extremely low-resolution neurons. Although individual neurons are unreliable due to intrinsic noise and limitations, their interaction allows error-free reconstruction. In particular, we show that the number of neurons required for reconstruction grows only linearly while the resolution of the input grows exponentially. By manipulating the information structure of neurons, we can make them sensitive to symbolic information in signals, such as hierarchical binary trees or the relative order of elements in sequences. These features are a hallmark of symbolic systems and of higher-order cognitive skills.
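The linear-neurons versus exponential-resolution claim can be illustrated with a minimal sketch (not taken from the paper) in which a high-resolution input is distributed across 1-bit "neurons": n binary units jointly encode 2^n input levels, so the required number of units grows only as log2 of the resolution. The `encode`/`decode` helpers below are hypothetical names introduced here for illustration.

```python
import math

def encode(value, n_neurons):
    """Split a high-resolution integer across n binary (1-bit) neurons."""
    return [(value >> i) & 1 for i in range(n_neurons)]

def decode(bits):
    """Error-free reconstruction from the joint neuron states."""
    return sum(b << i for i, b in enumerate(bits))

# The resolution R of the input grows exponentially (2, 16, 256, 65536),
# yet the number of neurons needed grows only linearly (1, 4, 8, 16).
for k in [1, 4, 8, 16]:
    R = 2 ** k
    n = math.ceil(math.log2(R))  # neurons required for resolution R
    x = R - 1                    # highest-resolution input value
    assert decode(encode(x, n)) == x
    print(f"resolution={R:6d} -> neurons={n}")
```

This is only the combinatorial core of the argument; the paper's contribution concerns how such distributed structure can be realized by noisy, unreliable neurons whose interaction still permits exact reconstruction.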
Submission Number: 51