Soft Matching Distance: A metric on neural representations that captures single-neuron tuning

Published: 02 Nov 2023, Last Modified: 18 Dec 2023, UniReps Oral
Supplementary Material: pdf
Keywords: representational similarity, deep neural networks, optimal transport, CKA, RSA, neural tuning, interpretability
Abstract: Common measures of neural representational (dis)similarity are designed to be insensitive to rotations and reflections of the neural activation space. Motivated by the premise that the tuning of individual units may be important, there has been recent interest in developing stricter notions of representational (dis)similarity that require neurons to be individually matched across networks. When two networks have the same size (i.e., the same number of neurons), a distance metric can be formulated by optimizing over neuron index permutations to maximize tuning curve alignment. However, it is not clear how to generalize this metric to measure distances between networks of different sizes. Here, we leverage a connection to optimal transport theory to derive a natural generalization based on "soft" permutations. The resulting metric is symmetric, satisfies the triangle inequality, and can be interpreted as a Wasserstein distance between two empirical distributions. Furthermore, our proposed metric avoids counter-intuitive outcomes suffered by alternative approaches, and captures complementary geometric insights into neural representations that are entirely missed by rotation-invariant metrics.
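The construction described in the abstract admits a compact sketch. The following is a minimal illustration under stated assumptions, not the authors' reference implementation: it treats each neuron's tuning curve (a column of the activation matrix) as a point, assumes a squared Euclidean ground cost and uniform mass on every neuron, and solves the resulting optimal transport problem with the POT library (`ot.dist`, `ot.emd2`). The function name `soft_matching_distance` and these preprocessing choices are assumptions made for illustration.

```python
import numpy as np
import ot  # POT: Python Optimal Transport (pip install pot)

def soft_matching_distance(X, Y):
    """Wasserstein-style distance between two networks' representations.

    X : (m, n1) array -- responses of n1 neurons to m stimuli
    Y : (m, n2) array -- responses of n2 neurons to the same m stimuli

    Each column (one neuron's tuning curve) is a point in R^m; we solve
    an optimal transport problem between the two empirical distributions
    of tuning curves, with uniform weights on the neurons.
    NOTE: cost function and normalization here are illustrative choices,
    not necessarily those of the paper.
    """
    n1, n2 = X.shape[1], Y.shape[1]
    # Pairwise squared Euclidean cost between tuning curves, shape (n1, n2).
    C = ot.dist(X.T, Y.T, metric="sqeuclidean")
    # Uniform marginals: every neuron carries equal mass.
    a = np.full(n1, 1.0 / n1)
    b = np.full(n2, 1.0 / n2)
    # Optimal "soft permutation" (transport plan) cost via exact OT.
    cost = ot.emd2(a, b, C)
    return np.sqrt(cost)  # 2-Wasserstein distance

# Usage: 100 stimuli, networks with 64 and 96 neurons.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 64))
Y = rng.standard_normal((100, 96))
print(soft_matching_distance(X, Y))
```

When the two networks have the same number of neurons, the optimum of this linear program is attained at a vertex of the Birkhoff polytope, i.e., a hard permutation, so the soft metric recovers the permutation-based distance described in the abstract; unequal sizes simply yield a fractional (soft) matching.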
Track: Proceedings Track
Submission Number: 31