Information-theoretic Neural Decoding Reproduces Several Laws of Human Behavior

Published: 21 Nov 2022, Last Modified: 05 May 2023
Venue: InfoCog @ NeurIPS 2022 (Oral)
Keywords: information theory, response time, cognition, bayesian, neural decoding
TL;DR: Determining what is being encoded in a neural rate code reproduces timing and accuracy patterns that match human behavior.
Abstract: Features of tasks and environments are often represented in the brain by neural firing rates. Representations must be decoded to enable downstream actions, and decoding takes time. We describe a toy model with a Poisson process encoder and an ideal-observer Bayesian decoder, and show that decoding rate-coded signals reproduces classic patterns of response time and accuracy observed in humans, including the Hick-Hyman Law, the Power Law of Learning, speed-accuracy trade-offs, and response times that match lognormal distributions. The decoder is equipped with a codebook, a prior distribution over signals, and an entropy stopping threshold. We argue that historical concerns about the applicability of such information-theoretic tools to neural and behavioral data arise from a confusion about the application of discrete-time coding techniques to continuous-time signals.
In-person Presentation: yes
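The decoding scheme described in the abstract can be sketched in code: a Poisson process encodes one of several candidate signals as spike counts, and a Bayesian observer accumulates evidence until the posterior entropy falls below a threshold, at which point decision time is recorded. This is a minimal illustration under assumed parameters (firing rates, time step, threshold), not the authors' implementation; the function name and codebook values are hypothetical.

```python
import numpy as np

def decode_rate_code(true_signal, codebook, prior, entropy_threshold,
                     dt=0.001, max_steps=10000, rng=None):
    """Ideal-observer decoding of a Poisson rate code.

    true_signal: index of the signal actually being encoded.
    codebook: array (n_signals, n_neurons) of firing rates in Hz,
        shared by encoder and decoder.
    prior: decoder's prior probability over signals.
    entropy_threshold: stop when posterior entropy (bits) drops below this.
    Returns (decoded_signal_index, decision_time_in_seconds).
    """
    rng = np.random.default_rng() if rng is None else rng
    log_post = np.log(prior)
    rates = codebook * dt  # expected spikes per neuron per time bin
    for step in range(1, max_steps + 1):
        # Encoder: sample spikes for the true signal in this time bin.
        spikes = rng.poisson(rates[true_signal])
        # Decoder: Poisson log-likelihood of the spikes under each candidate
        # (the log-factorial term is constant across candidates and dropped).
        log_post += np.sum(spikes * np.log(rates + 1e-12) - rates, axis=1)
        log_post -= log_post.max()          # numerical stability
        post = np.exp(log_post)
        post /= post.sum()
        entropy = -np.sum(post * np.log2(post + 1e-12))
        if entropy < entropy_threshold:     # confident enough: respond
            return int(np.argmax(post)), step * dt
    return int(np.argmax(post)), max_steps * dt

# Example: two signals, two neurons with distinct assumed rate profiles.
codebook = np.array([[50.0, 5.0],
                     [5.0, 50.0]])
prior = np.array([0.5, 0.5])
decoded, rt = decode_rate_code(0, codebook, prior, entropy_threshold=0.1,
                               rng=np.random.default_rng(0))
```

Raising the entropy threshold trades accuracy for speed, and adding more candidate signals (or a less informative prior) lengthens decision times, the qualitative patterns the paper connects to speed-accuracy trade-offs and the Hick-Hyman Law.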