Brain2Model Learning: Training sensory and decision models with human neural activity as a teacher
Keywords: brain-guided learning, single neuron recordings, electrophysiology, EEG, VAE, RNN, decision-making, working memory, vision
TL;DR: We improve neural network training in memory, decision-making, and sensory processing by guiding learning with human neural activity
Abstract: Cognitive neuroscience shows that the human brain creates low-dimensional, abstract representations for efficient sensorimotor coding. Importantly, the brain can learn these representations with significantly fewer data points and less computational power than artificial models require. We introduce Brain2Model Learning (B2M), a framework in which neural activity recorded from humans during sensory and decision-making tasks guides the training of artificial neural networks, via contrastive learning or latent regression. We provide proof-of-concept demonstrations of B2M on memory-based decision-making with a recurrent neural network and on scene reconstruction for autonomous driving with a variational autoencoder. Our results show that student networks benefiting from brain-derived guidance converge faster, achieve higher predictive accuracy, or both, compared to networks trained in isolation. This indicates that the brain's representations can be useful for artificial learners, facilitating efficient learning of sensorimotor representations that would otherwise be costly or slow to acquire through purely artificial training.
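The abstract describes two ways of injecting brain-derived guidance into training: contrastive alignment and latent regression. Below is a minimal, hedged sketch of how such an auxiliary objective could be wired up in PyTorch; it is not the authors' implementation, and the names (`BrainProjection`, `latent_dim`, `brain_dim`, `alpha`, `beta`) and the exact loss weighting are assumptions made for illustration only.

```python
# Illustrative sketch of a Brain2Model-style auxiliary objective (assumption, not
# the authors' code). It assumes paired batches of stimuli and recorded human brain
# activity (e.g. EEG features or spike counts) together with the student network's
# latent codes for the same stimuli.

import torch
import torch.nn as nn
import torch.nn.functional as F


class BrainProjection(nn.Module):
    """Linear maps between the brain-activity space and the student latent space."""

    def __init__(self, brain_dim: int, latent_dim: int):
        super().__init__()
        self.to_latent = nn.Linear(brain_dim, latent_dim)  # used for contrastive guidance
        self.to_brain = nn.Linear(latent_dim, brain_dim)   # used for latent regression

    def forward(self, brain: torch.Tensor) -> torch.Tensor:
        return self.to_latent(brain)


def contrastive_guidance(z: torch.Tensor, brain_latent: torch.Tensor,
                         temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style loss: each student latent should match its paired brain response."""
    z = F.normalize(z, dim=-1)
    brain_latent = F.normalize(brain_latent, dim=-1)
    logits = z @ brain_latent.t() / temperature           # (batch, batch) similarities
    targets = torch.arange(z.size(0), device=z.device)    # positives on the diagonal
    return F.cross_entropy(logits, targets)


def b2m_loss(z: torch.Tensor, brain: torch.Tensor, proj: BrainProjection,
             task_loss: torch.Tensor, alpha: float = 0.5, beta: float = 0.5) -> torch.Tensor:
    """Combine the student's task loss with brain-derived guidance terms."""
    regression = F.mse_loss(proj.to_brain(z), brain)       # latent regression onto brain activity
    contrastive = contrastive_guidance(z, proj(brain))     # contrastive alignment in latent space
    return task_loss + alpha * regression + beta * contrastive
```

In this sketch `z` would be the latent code of the student (e.g. a VAE bottleneck or RNN hidden state) and `task_loss` its ordinary objective (reconstruction or decision loss); the guidance terms are simply added as weighted regularizers during training.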
Submission Number: 20