Seeing the forest and the tree: Building representations of both individual and collective dynamics with transformers

Published: 31 Oct 2022, Last Modified: 15 Oct 2022
Venue: NeurIPS 2022 Accept
Keywords: Transformers, population dynamics, multi-variate time-series, neural activity, many-body systems
Abstract: Complex time-varying systems are often studied by abstracting away from the dynamics of individual components to build a model of the population-level dynamics from the start. However, when building a population-level description, it can be easy to lose sight of each individual and how it contributes to the larger picture. In this paper, we present a novel transformer architecture for learning from time-varying data that builds descriptions of both the individual and the collective population dynamics. Rather than combining all of our data into the model at the outset, we develop a separable architecture that operates on individual time-series first before passing them forward; this induces a permutation-invariance property and can be used to transfer across systems of different size and order. After demonstrating that our model can successfully recover complex interactions and dynamics in many-body systems, we apply it to populations of neurons in the nervous system. On neural activity datasets, we show that our model not only yields robust decoding performance, but also transfers well across recordings of different animals without any neuron-level correspondence. By enabling flexible pre-training that can be transferred to neural recordings of different size and order, our work provides a first step towards creating a foundation model for neural decoding.
TL;DR: We present a novel transformer architecture for learning from time-varying data that builds descriptions of both the individual and the collective population dynamics.
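To make the separable design described in the abstract concrete, here is a minimal sketch of one way such an architecture could look (all names, such as SeparableEncoder, are hypothetical and this is not the authors' released code): a shared temporal transformer encodes each individual time-series independently, and a second transformer then attends across the resulting per-channel embeddings. Because no positional encoding is applied across channels and the final pooling is symmetric, the population representation is invariant to channel ordering and accepts populations of different sizes.

```python
# Minimal sketch of a separable, permutation-invariant encoder (hypothetical
# names; an illustration of the idea, not the paper's implementation).
# Each channel's time-series is first encoded by a shared temporal
# transformer (the "individual" stage); a spatial transformer then attends
# across channels (the "collective" stage).
import torch
import torch.nn as nn

class SeparableEncoder(nn.Module):
    def __init__(self, t_len: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.embed = nn.Linear(1, d_model)  # scalar sample -> d_model
        temporal_layer = nn.TransformerEncoderLayer(
            d_model, n_heads, batch_first=True)
        spatial_layer = nn.TransformerEncoderLayer(
            d_model, n_heads, batch_first=True)
        self.temporal = nn.TransformerEncoder(temporal_layer, num_layers=2)
        self.spatial = nn.TransformerEncoder(spatial_layer, num_layers=2)
        # learned positional encoding over TIME only; none over channels,
        # which is what makes the model permutation-invariant across channels
        self.time_pos = nn.Parameter(torch.zeros(1, t_len, d_model))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_channels, t_len); n_channels may vary across datasets
        b, n, t = x.shape
        h = self.embed(x.reshape(b * n, t, 1)) + self.time_pos
        h = self.temporal(h)                 # individual dynamics, per channel
        h = h.mean(dim=1).reshape(b, n, -1)  # pool over time per channel
        h = self.spatial(h)                  # collective (population) dynamics
        return h.mean(dim=1)                 # population-level representation

# Usage: populations of different sizes share the same encoder weights.
enc = SeparableEncoder(t_len=100)
z_a = enc(torch.randn(8, 50, 100))   # recording with 50 neurons
z_b = enc(torch.randn(8, 120, 100))  # recording with 120 neurons
print(z_a.shape, z_b.shape)          # torch.Size([8, 64]) torch.Size([8, 64])
```

In this sketch, the same weights apply to a 50-neuron and a 120-neuron recording without any neuron-level correspondence, which is the property that would permit the cross-animal transfer described in the abstract.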