Rotation-invariant clustering of functional cell types in primary visual cortex

Anonymous

Sep 25, 2019, ICLR 2020 Conference Blind Submission
  • TL;DR: We classify mouse V1 neurons into putative functional cell types based on their representations in a CNN predicting neural responses
  • Abstract: Similar to a convolutional neural network (CNN), the mammalian retina encodes visual information into several dozen nonlinear feature maps, each formed by one ganglion cell type that tiles the visual space in an approximately shift-equivariant manner. Whether such organization into distinct cell types is maintained at the level of cortical image processing is an open question. Predictive models building upon convolutional features have been shown to provide state-of-the-art performance, and have recently been extended to include rotation equivariance in order to account for the orientation selectivity of V1 neurons. However, in these models there is generally no direct correspondence between CNN feature maps and groups of individual neurons, leaving open the question of whether V1 neurons form distinct functional clusters. Here we build upon the rotation-equivariant representation of a CNN-based V1 model and propose a methodology for clustering the neuronal representations in this model to find functional cell types independent of the neurons' preferred orientations. We apply this method to a dataset of 6000 neurons and provide evidence that discrete functional cell types exist in V1. By visualizing the preferred stimuli of these clusters, we highlight the range of non-linear computations executed by V1 neurons.
  • Keywords: computational neuroscience, neural system identification, functional cell types, deep learning, rotational equivariance
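
The page does not describe the clustering procedure itself. As a rough, non-authoritative sketch of what clustering neuronal representations "independent of preferred orientation" could look like, the snippet below clusters synthetic per-neuron readout weights over a rotation-equivariant feature space, using a distance that is minimized over cyclic shifts of the rotation axis so that neurons differing only in preferred orientation end up close together. All names, array shapes, the toy data, and the choice of hierarchical clustering are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical readout weights: n_neurons x n_features x n_rotations.
# In a rotation-equivariant model, the last axis indexes discrete rotations
# of each learned feature.
rng = np.random.default_rng(0)
n_neurons, n_features, n_rotations = 200, 16, 8
weights = rng.normal(size=(n_neurons, n_features, n_rotations))

def rotation_invariant_distance(w_a, w_b):
    """Smallest Euclidean distance over all cyclic shifts of the rotation axis,
    so two neurons that differ only in preferred orientation are nearly identical."""
    return min(
        np.linalg.norm(w_a - np.roll(w_b, shift, axis=-1))
        for shift in range(w_b.shape[-1])
    )

# Condensed pairwise distance vector for hierarchical clustering.
condensed = np.array([
    rotation_invariant_distance(weights[i], weights[j])
    for i in range(n_neurons)
    for j in range(i + 1, n_neurons)
])

# Agglomerative clustering on the rotation-invariant distances;
# cut the dendrogram into, e.g., 10 putative functional cell types.
Z = linkage(condensed, method="average")
labels = fcluster(Z, t=10, criterion="maxclust")
print(np.bincount(labels))
```

On this toy data the cluster sizes are arbitrary; the point is only that the distance, not the raw weights, carries the orientation invariance, so any clustering algorithm that accepts a precomputed distance matrix could be substituted.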