Disentangling Recurrent Neural Dynamics with Stochastic Representational Geometry

ICLR 2024 Workshop Re-Align Submission 42 Authors

Published: 02 Mar 2024, Last Modified: 03 May 2024 · ICLR 2024 Workshop Re-Align Contributed Talk · License: CC BY 4.0
Track: short paper (up to 5 pages)
Keywords: neural representations, dynamics, recurrent neural networks, shape metrics
TL;DR: Representational similarity metrics for comparing noisy neural responses to static stimuli can be used to distinguish recurrent neural dynamics.
Abstract: Uncovering and comparing the dynamical mechanisms that support neural processing remains a key challenge in the analysis of biological and artificial neural systems. However, measures of representational (dis)similarity in neural systems often assume that neural responses are static in time. Here, we show that stochastic shape metrics (Duong et al., 2023), which were developed to compare noisy neural responses to static inputs and lack an explicit notion of temporal structure, are nonetheless well equipped to compare noisy dynamics. In two examples, we use stochastic shape metrics, which interpolate between comparing mean trajectories and comparing second-order fluctuations about those means, to disentangle recurrent versus external contributions to noisy dynamics.
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Submission Number: 42
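
To make the comparison described in the abstract concrete, the following is a minimal sketch of an interpolated distance between two stochastic representations, each summarized by per-condition Gaussian statistics (means and covariances), loosely in the spirit of the stochastic shape metrics of Duong et al. (2023). The function names, the alpha weighting between the mean and covariance terms, and the omission of the optimal alignment (e.g., rotation) step are assumptions made here for brevity; this is not the authors' exact metric.

# Hypothetical sketch (not the exact metric of Duong et al., 2023): an
# interpolated distance between two stochastic representations, each
# summarized by per-condition Gaussian statistics (mean, covariance).
import numpy as np
from scipy.linalg import sqrtm

def bures_sq(S1, S2):
    # Squared Bures distance between covariance matrices S1 and S2.
    root = sqrtm(sqrtm(S1) @ S2 @ sqrtm(S1))
    return np.trace(S1) + np.trace(S2) - 2.0 * np.real(np.trace(root))

def interpolated_stochastic_distance(means1, covs1, means2, covs2, alpha=1.0):
    # means*: (n_conditions, n_features) mean responses; for dynamics, a mean
    #         trajectory can be flattened into one feature vector per condition.
    # covs*:  (n_conditions, n_features, n_features) trial-to-trial covariances.
    # alpha:  assumed interpolation weight in [0, 2]; larger alpha emphasizes
    #         mean trajectories, smaller alpha emphasizes second-order
    #         fluctuations about the means (alpha = 1 resembles a squared
    #         Gaussian 2-Wasserstein distance per condition).
    d2 = 0.0
    for m1, S1, m2, S2 in zip(means1, covs1, means2, covs2):
        d2 += alpha * np.sum((m1 - m2) ** 2) + (2.0 - alpha) * bures_sq(S1, S2)
    return np.sqrt(d2 / len(means1))

Sweeping alpha in such a sketch probes whether two systems differ mainly in their mean trajectories or in the noise structure about those trajectories, which is the axis along which the abstract's two examples disentangle recurrent versus external contributions.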