Quantifying the Variability Collapse of Neural Networks

16 May 2023, OpenReview Archive Direct Upload
Abstract: It has been observed that the transferability of neural networks correlates positively with the in-class variation of the last-layer features. The recently discovered Neural Collapse (NC) phenomenon provides a new perspective on the geometry of the last-layer features of neural networks. In this paper, we propose a new metric, named VC, to characterize the variability collapse in the NC phenomenon. The metric VC is intrinsically related to the linear probing loss on the last-layer features. Moreover, it enjoys desirable theoretical and empirical properties, including invariance under invertible linear transformations and numerical stability, which distinguish it from previous metrics. Our extensive experiments verify that VC is indicative of both the variability collapse and the transferability of pretrained neural networks.
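The abstract does not give the VC formula itself, so as an illustration of the kind of quantity such metrics are built from, the sketch below computes the classic NC1-style variability ratio tr(Σ_W Σ_B^†)/C from within-class and between-class feature covariances. The function name and the simulated data are hypothetical; this is not the paper's VC metric.

```python
import numpy as np

def within_class_variability(features, labels):
    """NC1-style variability ratio: tr(Sigma_W @ pinv(Sigma_B)) / C.

    features: (n, d) array of last-layer features.
    labels:   (n,) integer class labels.
    Smaller values indicate stronger variability collapse.
    """
    classes = np.unique(labels)
    global_mean = features.mean(axis=0)
    d = features.shape[1]
    sigma_w = np.zeros((d, d))  # within-class covariance
    sigma_b = np.zeros((d, d))  # between-class covariance
    for c in classes:
        fc = features[labels == c]
        mu_c = fc.mean(axis=0)
        centered = fc - mu_c
        sigma_w += centered.T @ centered / len(features)
        diff = (mu_c - global_mean)[:, None]
        sigma_b += diff @ diff.T / len(classes)
    return np.trace(sigma_w @ np.linalg.pinv(sigma_b)) / len(classes)

# Toy data: fully collapsed features (each sample equals its class
# mean) versus the same features with added noise.
rng = np.random.default_rng(0)
means = rng.normal(size=(3, 5))
labels = np.repeat(np.arange(3), 10)
collapsed = means[labels]
noisy = collapsed + 0.1 * rng.normal(size=collapsed.shape)

print(within_class_variability(collapsed, labels))  # ~0: full collapse
print(within_class_variability(noisy, labels))      # > 0: residual variation
```

Fully collapsed features have zero within-class scatter, so the ratio vanishes; any residual in-class variation makes it positive. Note that, unlike the VC metric described in the abstract, this ratio can be numerically unstable when Σ_B is near-singular.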