Caveats of neural persistence in deep neural networks

Published: 18 Jun 2023, Last Modified: 04 Jul 2023, TAGML 2023 Poster
Keywords: topological data analysis, analysis of neural networks
Abstract: Neural persistence is a prominent measure for quantifying neural network complexity, proposed in the emerging field of topological data analysis in deep learning. In this work, however, we find both theoretically and empirically that the variance of network weights and the spatial concentration of large weights are the main factors influencing neural persistence. First, we prove tighter bounds on neural persistence that motivate this claim theoretically. Then, we confirm that our interpretation holds in practice by computing neural persistence for synthetic weight matrices and for trained deep neural networks. This raises the question of whether the benefits of neural persistence can be achieved by simpler means, since even computing zeroth-dimensional persistent homology for large matrices is costly.
Type Of Submission: Extended Abstract (4 pages, non-archival)
Submission Number: 73
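The zeroth-dimensional persistent homology computation mentioned in the abstract can be sketched for a single fully connected layer. A minimal sketch, assuming the standard formulation in which a layer's weight matrix is treated as a bipartite graph, absolute weights are normalised to [0, 1], and 0-dimensional persistence pairs correspond to edges of a maximum spanning forest found via Kruskal's algorithm with union-find (the function name `neural_persistence` and the demo matrix are illustrative, not from the paper):

```python
import numpy as np

def neural_persistence(W, p=2):
    """Sketch: 0-dim persistent homology of one layer's bipartite weight
    graph, computed as a maximum spanning forest over normalised |weights|."""
    n_in, n_out = W.shape
    A = np.abs(W) / np.abs(W).max()  # normalise filtration values to [0, 1]
    # all bipartite edges (weight, input node i, output node n_in + j)
    edges = [(A[i, j], i, n_in + j) for i in range(n_in) for j in range(n_out)]
    edges.sort(reverse=True)  # descending filtration: strong weights enter first

    parent = list(range(n_in + n_out))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    pers = []
    for w, u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:               # edge merges two components:
            parent[ru] = rv        # a 0-dim feature born at 1 dies at w
            pers.append(1.0 - w)
    return np.linalg.norm(pers, ord=p)

# illustrative usage on a random layer
rng = np.random.default_rng(0)
print(neural_persistence(rng.normal(size=(4, 3))))
```

The union-find pass makes the cost of one layer O(E log E) in the number of edges E, which grows quadratically with layer width; this is the expense the abstract alludes to for large matrices.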