Data Free Metrics Are Not Reparameterisation Invariant Under the Critical and Robust Layer Phenomena

Published: 09 Jun 2025, Last Modified: 09 Jun 2025 · HiLD at ICML 2025 Poster · CC BY 4.0
Keywords: Random Matrix Theory, Critical and Robust Layers, Data-Free Methods
TL;DR: We explore whether data-free metrics are reparameterisation invariant under the critical and robust layer phenomena and find that they lack predictive capacity in this setting.
Abstract: Data-free methods for analysing and understanding the layers of neural networks offer many metrics for quantifying notions of ``strong'' versus ``weak'' layers, with the promise of increased interpretability. We examine how robust data-free metrics are under random control conditions involving critical and robust layers. Contrary to the literature, we find counter-examples that undermine the efficacy of data-free methods. We show that data-free metrics are not reparameterisation invariant under these conditions and lose predictive capacity across correlation measures: RMSE, the Pearson coefficient, and Kendall's tau. Thus, we argue that to understand neural networks fundamentally, we must rigorously analyse the interactions between data, weights, and the resulting functions that contribute to their outputs -- contrary to traditional Random Matrix Theory perspectives.
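To make the invariance question concrete, here is a minimal sketch (not the authors' code) of the kind of check the abstract describes. It assumes the classic function-preserving reparameterisation of ReLU networks, where scaling one weight matrix by c > 0 and the next by 1/c leaves the network's output unchanged; the layer metric (spectral norm) and the scale factor c are illustrative choices, not taken from the paper. A reparameterisation-invariant data-free metric should score layers identically before and after the rescaling, which the RMSE, Pearson, and Kendall's tau measures named in the abstract would then confirm.

```python
import numpy as np
from scipy.stats import pearsonr, kendalltau

rng = np.random.default_rng(0)
layers = [rng.standard_normal((64, 64)) for _ in range(6)]

def spectral_norm(w):
    # Largest singular value: a common data-free proxy for layer "strength".
    return np.linalg.svd(w, compute_uv=False)[0]

before = np.array([spectral_norm(w) for w in layers])

# Function-preserving rescaling for ReLU nets: relu(c * W1 @ x) = c * relu(W1 @ x)
# for c > 0, so scaling layer i by c and layer i+1 by 1/c leaves the
# network function unchanged. c = 10.0 is a hypothetical scale factor.
c = 10.0
layers[2] *= c
layers[3] /= c

after = np.array([spectral_norm(w) for w in layers])

# Compare metric scores before/after the function-preserving rescaling.
rmse = np.sqrt(np.mean((before - after) ** 2))
print(f"RMSE:          {rmse:.3f}")
print(f"Pearson r:     {pearsonr(before, after)[0]:.3f}")
print(f"Kendall's tau: {kendalltau(before, after)[0]:.3f}")
```

Under this sketch the rescaled layers change their spectral-norm scores even though the network computes the same function, so the before/after agreement degrades across all three measures, which is the failure mode of non-invariant data-free metrics that the paper investigates.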
Student Paper: Yes
Submission Number: 113