Keywords: $\textit{C. elegans}$, scaling properties, neural dynamics, self-supervised prediction, ANNs
TL;DR: In $\textit{C. elegans}$, we explore how the accuracy of self-supervised models in predicting neural activity scales with data volume and model complexity, offering insights for neuroscience and biologically-inspired AI.
Abstract: The nematode worm $\textit{C. elegans}$ permits straightforward optical measurement of neural activity, making it a unique platform for studying intrinsic neural dynamics. This paper investigates the scaling properties of self-supervised prediction of neural activity from past neural data alone, omitting behavioral variables. Specifically, we examine how predictive accuracy, quantified by the mean squared error (MSE), scales with the amount of training data, considering the number of neurons recorded, the recording duration, and the diversity of datasets. We also examine how these scaling properties depend on the artificial neural network (ANN) models themselves, including their size, architecture, and hyperparameters. Using the nervous system of $\textit{C. elegans}$ as an experimental platform, we elucidate the critical influence of data volume and model complexity on self-supervised neural prediction: the MSE decreases logarithmically with the amount of training data, a trend consistent across diverse datasets, whereas varying the size of the ANN model produces nonlinear changes in the MSE. These findings emphasize the need for enhanced high-throughput tools for extended imaging of entire mesoscale nervous systems, in order to acquire sufficient data for developing highly accurate ANN models of neural dynamics, with significant implications for systems neuroscience and biologically-inspired AI.
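The abstract's central empirical claim is that the MSE decreases logarithmically with the amount of training data. A minimal sketch of how such a log-linear scaling relation can be fit by least squares is shown below; the function name `fit_log_scaling`, the synthetic data, and the coefficient values are all illustrative assumptions, not results from the paper.

```python
# Sketch: fitting a log-linear scaling law MSE(N) = a + b * log(N),
# matching the abstract's claim that prediction error decreases
# logarithmically with the amount of training data N.
# The data and coefficients here are synthetic placeholders.
import numpy as np

def fit_log_scaling(n_samples, mse):
    """Least-squares fit of mse ~ a + b * log(n_samples); returns (a, b)."""
    X = np.column_stack([np.ones_like(n_samples, dtype=float),
                         np.log(n_samples)])
    (a, b), *_ = np.linalg.lstsq(X, mse, rcond=None)
    return a, b

# Synthetic example: assumed intercept 1.0 and slope -0.1 per log-unit.
n = np.array([1e2, 1e3, 1e4, 1e5])
mse = 1.0 - 0.1 * np.log(n)
a, b = fit_log_scaling(n, mse)
```

Under a fit like this, the slope `b` summarizes how many log-units of additional data are needed for a given reduction in prediction error, which is one way to compare scaling behavior across datasets.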
Primary Area: applications to neuroscience & cognitive science
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3075